
PANDA Straws and DAQ system under beam

February was a month of very intensive work to prepare our straw detector and Data Acquisition System for tests with a proton beam from the COSY accelerator at Forschungszentrum Juelich in Germany.

Together with 5 other groups, we were granted one week of beamtime to evaluate the detectors, electronics and software.

It was the first time we evaluated the operation of the entire, small-scale detector system for the PANDA experiment. Three detector subsystems: the Forward Tracker, the Electromagnetic Calorimeter and Time-of-Flight, each with its own readout system, were synchronized with the SODANet system, and the generated data was processed by a set of 3 Compute Node modules for burst building and preliminary preprocessing.

It was also an opportunity to test the data preprocessing system based on the Xilinx ZCU102 platform. The board receives data streams from the digitizing boards and recovers track candidates, rejecting empty events.
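To give a flavour of what rejecting empty events means in practice, here is a minimal software sketch (not the actual ZCU102 firmware); the hit layout and the hit-count threshold below are assumptions made for illustration only:

```cpp
#include <cstdint>
#include <vector>

// Hypothetical straw hit: module/straw address plus a raw TDC timestamp.
struct Hit {
    uint8_t  module;
    uint16_t straw;
    uint32_t tdc;
};

using Event = std::vector<Hit>;

// Keep only events with enough hits to form a track candidate; empty or
// near-empty events are dropped early, so they never load the downstream
// processing (the threshold of 4 hits is an illustrative choice).
std::vector<Event> rejectEmptyEvents(const std::vector<Event>& events,
                                     std::size_t minHits = 4)
{
    std::vector<Event> kept;
    for (const auto& ev : events) {
        if (ev.size() >= minHits) {
            kept.push_back(ev);
        }
    }
    return kept;
}
```

In the real system this selection happens on streaming data in programmable logic, but the idea is the same: events that cannot form a track candidate never leave the board.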

Digilent Design Contest 2018 – Finals

We had the great pleasure of participating in the Digilent Design Contest 2018 finals. The competition was strong this year. A team from Serbia won first prize with their Deep Neural Network Hardware Accelerator.

The podium also included one team from Romania and two from Poland.

The Additive Synthesizer from Gdansk University of Technology was awarded the Digilent special prize for the best usage of Digilent Instruments.

The event lasted two days. On Saturday all finalists presented their work. On Sunday only a few teams were invited to present their solutions in detail.

Most of the submissions were diploma theses or long-term projects conducted by experienced engineers.

Our approach to the competition was to learn as much as possible and explore different approaches to Augmented Reality on SoC devices. In my opinion it is not necessarily important to win, but to compete, cooperate and learn state-of-the-art techniques and methods. We drew conclusions and got valuable feedback from the community.

First 3D reconstruction

Enhanced image reconstruction, including 3D and TOF functionalities, has been successfully implemented entirely in the FPGA!

In programmable logic we find LOR candidates and reconstruct the annihilation point coordinates. Then only the X, Y, Z values are sent from the JPET Controller to the server that produces a 3D canvas with the scanner visualization.
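For the curious, the geometry behind the TOF-based annihilation point is simple enough to sketch in a few lines of C++ (an illustration, not the FPGA implementation; the names and units are assumed): the point lies on the LOR between the two hits, shifted from the midpoint by c·Δt/2 towards the earlier hit.

```cpp
#include <cmath>

struct Point3 { double x, y, z; };   // coordinates in millimetres (assumed)

// Estimate the annihilation point on the LOR defined by hits a and b,
// with hit times tA and tB in nanoseconds. From dA - dB = c*(tA - tB)
// and dA + dB = len it follows that dA = (len + c*(tA - tB)) / 2.
Point3 annihilationPoint(const Point3& a, const Point3& b, double tA, double tB)
{
    constexpr double c = 299.792458;          // speed of light in mm/ns
    const double dx = b.x - a.x;
    const double dy = b.y - a.y;
    const double dz = b.z - a.z;
    const double len = std::sqrt(dx*dx + dy*dy + dz*dz);
    const double dA  = 0.5 * (len + c * (tA - tB));  // distance from hit a
    const double f   = dA / len;                     // fraction along a -> b
    return { a.x + f * dx, a.y + f * dy, a.z + f * dz };
}
```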

You can find a video showing it in action under [this] link.

Real-time image reconstruction

Another step toward real-time tomographic image reconstruction has been made!

The JPET Controller processes data from 8 TRBv3 boards in several steps leading to image creation:

  • Reception and synchronization of data units from the TRBv3s
  • Hit data extraction
  • Detector geometry mapping
  • Coincidence search
  • LOR coordinates calculation
  • Data transmission

All these steps, performed 50 000 times per second and processing hundreds of MB per second, reduce the data volume to hundreds of KB and limit the processing on the CPU to just drawing points. All of this on a single Xilinx Zynq.
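To make the coincidence search step more concrete, here is a rough software equivalent of the time-window scan (a sketch under assumed data types and window width, not the programmable-logic design): hits sorted in time are paired whenever two consecutive hits on different strips fall within the coincidence window.

```cpp
#include <algorithm>
#include <cstddef>
#include <utility>
#include <vector>

// Hypothetical hit after geometry mapping: strip index and hit time in ns.
struct Hit {
    int    strip;
    double time;
};

// Pair hits whose time difference is below the coincidence window.
// The real design performs this on streaming data in the FPGA; here it
// is done offline on a sorted vector, purely for illustration.
std::vector<std::pair<Hit, Hit>> findCoincidences(std::vector<Hit> hits,
                                                  double windowNs = 5.0)
{
    std::sort(hits.begin(), hits.end(),
              [](const Hit& a, const Hit& b) { return a.time < b.time; });

    std::vector<std::pair<Hit, Hit>> pairs;
    for (std::size_t i = 0; i + 1 < hits.size(); ++i) {
        if (hits[i + 1].time - hits[i].time < windowNs &&
            hits[i + 1].strip != hits[i].strip) {
            pairs.emplace_back(hits[i], hits[i + 1]);
            ++i;  // both hits of the pair are consumed
        }
    }
    return pairs;
}
```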

Next steps are:

  • Introduction of calibration parameters
  • Time-Of-Flight
  • 3rd dimension (Z axis)

Under this [link] you can find a video that shows the reconstructed image being drawn in real time as a radioactive source on a robotic arm scans the detector.

Medical Image Reconstruction

The first images have been produced by the JPET Controller board!

The controller processes data streams from 8 TRBs, parses the TDC data and recovers hits on the scintillators. The hits are correlated by scanning with a time window and then mapped onto the detector geometry in order to recover the LOR coordinates. Finally, instead of raw TDC data, only the two LOR endpoints are sent.
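As a final illustration, the geometry-mapping step can be pictured like this (a sketch only; the strip count and radius below are example values, not the actual JPET channel map): each scintillator strip index is converted to a point on a cylinder, and the two matched hits give the two LOR endpoints that are sent out.

```cpp
#include <cmath>

// Illustrative only: assume nStrips scintillator strips spaced evenly on a
// cylinder of radius radiusMm; the real channel-to-geometry mapping is
// detector specific and not reproduced here.
struct Endpoint { double x, y; };

Endpoint stripToEndpoint(int strip, int nStrips = 192, double radiusMm = 425.0)
{
    constexpr double kPi = 3.14159265358979323846;
    const double phi = 2.0 * kPi * strip / nStrips;  // azimuthal angle of the strip
    return { radiusMm * std::cos(phi), radiusMm * std::sin(phi) };
}
```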