Our J-PET scanner was one of the three main whole-body PET imaging projects presented at the Total Body PET – From Mice to Men conference.
Our FPGA-FAIS group contributed a poster highlighting the main ideas behind our innovative tomographic data processing. Since no other system performs high-level analysis of this sort directly on FPGAs, the poster attracted a lot of interest.
You can view the poster at this link.
We are pleased to announce that we have finalized the main details of the III Edition of The Symposium on Programmable Logic Devices.
For more information, see the details page.
We will keep updating the program; register soon to receive updates, or subscribe to the news feed in the left panel.
It was a great pleasure to attend the XLIInd Summer IEEE-SPIE Joint Symposium on Photonics, Web Engineering, Electronics for Astronomy and High Energy Physics Experiments, where I presented the DAQ system for the PET tomography devices developed by our team.
The conference takes place in the Warsaw University of Technology's Wilga Village, where attendees are also accommodated. The quiet, nature-filled surroundings aid concentration and provide space to relax after the scientific activities.
Next Tuesday (22 May, 16:00, B-2-50):
- Karol Farbaniec will report on the Digilent Design Contest
- Maciej Bendec will report on his research on implementing neural networks on FPGAs
- We’ll discuss the creation of an IEEE Student Branch for FPGA and Networking
- We’ll discuss the schedule and details of the III Symposium
It is now possible to view a real-time visualization of our scanner in your browser (if you are inside the department's local network).
The most interesting thing is how the image gets there:
- Data from the digitizers is received by the J-PET Controller FPGA
- Lines-of-Response are reconstructed in real time
- Regions-of-Response are converted into X, Y, Z annihilation coordinates
- The stream of reconstructed points is forwarded from the programmable logic through Xillybus to shared DDR memory
- PetaLinux with Ubuntu, running on the integrated ARM cores, accesses the memory and streams the points to your browser via a NodeJS server
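The Line-of-Response step above can be illustrated with a short sketch. This is a hypothetical example, not the actual FPGA firmware: given two detector hits with positions and timestamps, the annihilation point lies on the LOR, offset from its midpoint by c·Δt/2 toward the earlier hit. The detector geometry, units (mm and ns), and function names here are illustrative assumptions.

```python
# Hypothetical sketch of the LOR -> (X, Y, Z) step (not the J-PET firmware):
# the annihilation point lies on the line between the two hit positions,
# shifted from the midpoint by c * dt / 2 toward the earlier hit.

C = 299.792458  # speed of light in mm/ns


def annihilation_point(p1, t1, p2, t2):
    """Estimate (x, y, z) of the annihilation along the LOR p1-p2.

    p1, p2: hit positions in mm; t1, t2: hit times in ns (assumed units).
    """
    # Unit vector pointing from hit 1 toward hit 2
    d = [b - a for a, b in zip(p1, p2)]
    length = sum(c_ * c_ for c_ in d) ** 0.5
    u = [c_ / length for c_ in d]
    # Midpoint of the LOR
    mid = [(a + b) / 2 for a, b in zip(p1, p2)]
    # Time-of-flight offset: if t1 < t2 the point is closer to p1,
    # i.e. shifted along -u; s is negative in that case.
    s = C * (t1 - t2) / 2
    return tuple(m + s * ui for m, ui in zip(mid, u))
```

With equal arrival times the point falls at the midpoint of the LOR; a time difference shifts it toward the earlier-firing detector.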
This is the next step in integrating the image reconstruction process within the FPGA SoC device. We now have a framework on which further data processing can be implemented in software.
We are grateful to Bartłomiej Flak and Dr Paweł Rajda from AGH Kraków for their contribution to the J-PET project.
It was a great pleasure to participate in the Digilent Design Contest 2018 finals. The competition was strong this year. A team from Serbia won first prize with their Deep Neural Network Hardware Accelerator.
Also on the podium were one team from Romania and two from Poland.
The Additive Synthesizer team from Gdansk University of Technology received the Digilent special prize for the best use of Digilent Instruments.
The event lasted two days. On Saturday all finalists presented their work; on Sunday only a few teams were invited to present their solutions in detail.
Most of the contest submissions were diploma theses or long-term projects conducted by experienced engineers.
Our approach to the competition was to learn as much as possible and explore different approaches to Augmented Reality on SoC devices. In my opinion, what matters is not necessarily winning, but competing, cooperating, and learning state-of-the-art techniques and methods. We drew conclusions and received valuable feedback from the community.
We are glad to announce that our manuscript “Evaluation of single-chip, real-time tomographic data processing on FPGA SoC devices” has been accepted for publication in IEEE Transactions on Medical Imaging!
We are the first to have developed a compact and integrated solution for high-level PET processing on FPGAs.
It is a significant confirmation that our work is appreciated by the community. There is much more to come in this area!
You can check it out here.