Challenges rather than datasets
The International Centre for Neuromorphic Systems (ICNS) at Western Sydney University is actively exploring the application of neuromorphic systems to real-world problems. Static datasets provide benchmarks that do not accurately reflect the nature, characteristics, and benefits of neuromorphic systems. We have therefore developed a robotic foosball table, instrumented so that one player is controlled by an electronic system driven by neuromorphic hardware and algorithms. We envision this sort of well-defined and easily understandable task as a potential benchmark and community-building challenge.
Why robotic foosball isn't such a trivial idea
Neuromorphic systems function in an inherently different manner to conventional computing systems and therefore require a different means of benchmarking and performance evaluation. Most industry-standard datasets were created specifically for conventional sensors and processing systems, so they are not directly applicable to neuromorphic systems, and particularly not to event-based systems. There have been numerous attempts to convert widely available standard datasets to a neuromorphic-compatible format, and although these converted datasets have been extremely useful to the field, they remain fundamentally designed around a static and usually frame-based paradigm.
The robotic foosball table is our showcase example of a new form of benchmarking and characterization, designed specifically for neuromorphic systems. Unlike conventional datasets, these benchmarks revolve around dynamic, real-world tasks that effectively fulfill the role of a dynamic dataset: the current sensing and decision-making lead to actions that alter the approach to the task and shape future sensing and processing. A camera that moves in response to what it has just seen is a simple example.
Such active and responsive data cannot be captured in a static dataset, and we are therefore building a series of challenge tasks to create an environment for this kind of evaluation. Indeed, we believe that the lack of active and dynamic benchmarking is a significant factor holding back the field of neuromorphic systems. We are actively building such demonstrators for event-based systems, foremost amongst them our robotic foosball table, in which one player is entirely controlled by a robotic system that can be driven through a very simple hardware or software interface. We are exploring other benchmarking tasks as well, and intend to use the same hardware and interface, so that any work done on the foosball table carries over to other tasks.
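To make the idea of a closed sense-act loop concrete, here is a minimal Python sketch of the kind of control loop such a simple interface could support. All names here (`BallEvent`, `RodController`, `update`) are illustrative assumptions, not the actual foosball table's API: the sketch simply shows sensing (a ball observation) driving an action (a bounded rod movement) that in turn changes what the system does next.

```python
# Hypothetical sketch of a closed-loop rod controller; the class and
# method names are illustrative assumptions, not the real ICNS interface.
from dataclasses import dataclass


@dataclass
class BallEvent:
    """A single sensed ball observation (e.g. from an event-based camera)."""
    x: float  # lateral ball position across the table, in millimetres


class RodController:
    """Steps one player rod toward the most recently observed ball position."""

    def __init__(self, position: float = 0.0, max_step: float = 10.0):
        self.position = position  # current rod position (mm)
        self.max_step = max_step  # mechanical speed limit per update (mm)

    def update(self, event: BallEvent) -> float:
        # Move toward the ball, bounded by the actuator's speed limit.
        error = event.x - self.position
        step = max(-self.max_step, min(self.max_step, error))
        self.position += step
        return self.position


# A short stream of observations drives the rod toward the ball.
controller = RodController()
for ev in [BallEvent(x=25.0), BallEvent(x=25.0), BallEvent(x=18.0)]:
    controller.update(ev)
print(controller.position)  # → 18.0
```

The point of the sketch is that each action depends on the history of sensing and acting so far, which is exactly the behaviour a static dataset cannot capture.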
We chose foosball as the first task because it is easily understood by everyone, it is engaging, and it poses multiple levels of challenge and strategy. We demonstrated our first prototype of the robotic foosball table at the Telluride Neuromorphic and Cognition Workshop in 2018. This provided us with invaluable insight into the needs and requirements for such a system, and as a result we have refined the mechanical design and completely overhauled the electronics and hardware interface.
We intend to build a community around the foosball table challenge. The idea behind a real-world task such as the foosball table is to provide a benchmark based on performance at a high-level task, rather than on accuracy against a ground truth. We envision multiple algorithms and approaches competing with one another: as the task is well defined, the best robot (or neuromorphic system) is simply the one that plays the game better.
Media Coverage of our Robotic Foosball Table
- Intel: Opinion: Intel’s Mike Davies Discusses Need for Neuromorphic Computing Benchmarks in Nature
- ZDNET: UWS uses Intel chips to create biologically-engineered foosball table
- Robotics Business Review: Intel Makes Neuromorphic Chips Available to Researchers
- Techradar: Intel’s new AI chips are 1000x faster than CPUs but there’s a catch