Development Environment Provides Complete Machine Learning Framework

BrainChip Holdings introduces the Akida Development Environment, a machine learning framework for creating, training, and testing spiking neural networks (SNNs), supporting the development of edge and enterprise products on the Company’s Akida Neuromorphic System-on-Chip (NSoC). Applications include public safety, transportation, agricultural productivity, financial security, cybersecurity, and healthcare.


The Akida Development Environment includes the Akida Execution Engine, data-to-spike converters, and a model zoo of pre-created SNN models. The framework leverages the Python scripting language and its associated tools and libraries, including Jupyter notebooks, NumPy, and Matplotlib.



Execution Engine


The Akida Execution Engine is at the center of the framework and contains a software simulation of the Akida neuron and synapses, along with the supported training methodologies. Through API calls in a Python script, users specify their neural network topology, training method, and datasets for execution. Based on the structure of the Akida neuron, the execution engine supports multiple training methods, including unsupervised training and unsupervised training with a labelled final layer.
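The Akida neuron model itself is proprietary and its API is not detailed here, but as a generic illustration of what a software-simulated spiking neuron involves, the sketch below implements a simple leaky integrate-and-fire (LIF) neuron in NumPy. All names and parameters are illustrative assumptions, not BrainChip's API:

```python
import numpy as np

def lif_neuron(spike_train, weights, threshold=1.0, leak=0.9):
    """Simulate one leaky integrate-and-fire neuron (illustrative only).

    spike_train: (T, N) binary array of input spikes over T time steps.
    weights:     (N,) synaptic weights.
    Returns a (T,) binary array of output spikes.
    """
    potential = 0.0
    out = np.zeros(len(spike_train), dtype=int)
    for t, spikes in enumerate(spike_train):
        potential = leak * potential + spikes @ weights  # leak, then integrate weighted input
        if potential >= threshold:                       # fire and reset
            out[t] = 1
            potential = 0.0
    return out
```

Driven with a constant strong input, such a neuron fires on every step; with a weaker input, the membrane potential must accumulate across steps before each spike, so the output rate drops.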


Data-to-Spike Converters


Spiking neural networks work on spike patterns. The development environment natively accepts spiking data created by Dynamic Vision Sensors (DVS). However, there are many other types of data that can be used with SNNs. Embedded in the Akida Execution Engine are data-to-spike converters, which convert common data formats such as image information (pixels) into the spikes required for an SNN.
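The article does not describe how the bundled converters are implemented, but a common generic approach to pixel-to-spike conversion is rate coding, where each pixel's intensity sets the probability that it spikes in a given time step. The sketch below shows that technique in NumPy; it is an assumption for illustration, not BrainChip's implementation:

```python
import numpy as np

def pixels_to_spikes(image, timesteps=100, seed=0):
    """Rate-code pixel intensities into Bernoulli spike trains (illustrative).

    image: 2-D array of intensities in [0, 255].
    Returns a (timesteps, H, W) binary spike array in which brighter
    pixels spike more often.
    """
    rng = np.random.default_rng(seed)
    rates = np.asarray(image, dtype=float) / 255.0  # per-step spike probability
    return (rng.random((timesteps, *rates.shape)) < rates).astype(np.uint8)
```

Under this scheme a fully black pixel (0) never spikes and a fully white pixel (255) spikes on every step, with intermediate intensities spiking proportionally.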


The development environment will initially ship with a pixel-to-spike converter, to be followed by converters for audio data and for big-data applications in cybersecurity, finance, and the Internet of Things. Users can also create their own proprietary data-to-spike converters for use within the development environment.
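The converter interface itself is not documented in this article, but as a hedged sketch of what a user-written converter might do, the function below applies latency coding to a 1-D signal (for example, an audio envelope): stronger values spike earlier in the time window. This is a standard generic encoding, not BrainChip's interface:

```python
import numpy as np

def latency_encode(values, timesteps=16):
    """Latency-code a 1-D signal: stronger values spike earlier (illustrative).

    values: 1-D array normalised to [0, 1].
    Returns a (timesteps, N) binary array with at most one spike per
    input; value 1.0 spikes at step 0, values near 0 spike last, and
    exactly 0 never spikes.
    """
    values = np.clip(np.asarray(values, dtype=float), 0.0, 1.0)
    spikes = np.zeros((timesteps, len(values)), dtype=np.uint8)
    times = np.round((1.0 - values) * (timesteps - 1)).astype(int)
    for i, (v, t) in enumerate(zip(values, times)):
        if v > 0:
            spikes[t, i] = 1
    return spikes
```

Latency coding is often chosen over rate coding when spike count must be kept low, since each input contributes at most one spike per window.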


Model Zoo


The Akida Development Environment includes pre-created SNN models. Currently available models include a multi-layer perceptron implementation for MNIST in DVS format, a 7-layer network optimized for the CIFAR-10 dataset, and a 22-layer network optimized for the ImageNet dataset. Users can modify these models or use them as the basis for their own custom SNN models.


The Akida Development Environment is currently available on an early-access program to approved customers. For more information, visit the Akida Development Environment page.
