Codeathon winners demonstrate open-science approaches with SPARC data and resources

Showcasing their problem-solving methodologies, codeathon teams presented their innovative solutions to demonstrate and expand the utility and interoperability of the SPARC Portal.

The third annual global SPARC FAIR Codeathon was a three-day, in-the-cloud event held August 10–12, 2024. The SPARC Data and Resource Center, with support from the National Institutes of Health Stimulating Peripheral Activity to Relieve Conditions (SPARC) program, offered a total prize pool of US$50,000, including cash prizes and manuscript publication fees. Throughout the event, each team had access to SPARC experts for technical advice as needed to turn its idea into a working prototype.

Each project was judged on the quality and completeness of its code, documentation, and a recorded presentation summarizing the project for the judging panel. Judging criteria also included the creativity of the team's solution, its impact on the SPARC research community, and its feasibility based on the prototype implementation.

We are grateful to the following panel of judges for their diligence and time in selecting the winners: Andrea Zirn, Bing Si, Mauricio Rangel Gomez, Michael Ojiere, Rajanikanth Vadigepalli, Stephen Brooks, and Udana Torian.

And the winners are!

The 3rd Prize of US$3,000 goes to SPARCTA, a SPARC Tiff Annotator. Team members include Haries Ramdhani, Anmol Kiran, and Archit Akram.

The 2nd Prize of US$7,000 goes to SPARC-SPy (SPARC Scaffolding in Python), a cross-platform Python visualisation tool that runs within o²S²PARC and produces VTK visualisations from data scaffolds. Team members include Michael Hoffman, Yun Gu, Mishaim Malik, Savindi Wijenayaka, and Matthew French.

In an unexpected turn of events, there is a tie between the grand prize winners.

A Grand Prize of US$15,000 goes to SPARC.RL, Reinforcement Learning for Medical Device Control Made Easy. Team members include Max Haberbusch and John Bentley.

A Grand Prize of US$15,000 goes to oSPARC-Hub, a tool for fast, small, savable workflows for SPARC data analysis. Team members include Shailesh Appukuttan, Hiba Ben Aribi, and Fynn Rievers.

The winners were showcased in a webinar in which the teams presented their projects and solutions. We invite you to watch the SPARC Codeathon Winners Announcement at https://www.youtube.com/watch?v=Q8DenyjSA6k. You can also find all previous SPARC Codeathon projects on the SPARC YouTube channel and the SPARC Portal. Judging is always difficult, and we thank everyone who participated and contributed their ideas to the 2024 SPARC FAIR Codeathon.

What is SPARC?

Data, simulations, and maps are published on the SPARC Portal, an open-source platform for finding, exploring, visualizing, interacting with, and accessing openly shared data and associated computational models and analyses. Through open-science principles, and by incorporating published experimental data, simulations, and maps, the SPARC Portal seeks to provide a scientific and technological foundation for future bioelectronic medicine, devices, and protocols for conditions of the peripheral nervous system.

Explore over 300 highly curated experiments, protocols, models, maps, and simulations at https://sparc.science.

The SPARC Portal was created to facilitate autonomic neuroscience research and advance the efficacy of bioelectronic medicine by hosting a growing collection of digital resources, including datasets, maps, and computational studies that focus on the role of the peripheral nervous system in organ function.

The resources available on the SPARC Portal are generated by SPARC program-supported research projects, as well as from consortia the SPARC DRC supports. We are continually adding new datasets, maps, and computational studies to the site and are building new functionality to access and interact with those resources.