Neural Network Syntax Analyzer For Embedded Standardized Deep Learning

Keywords

Deep learning; Machine learning; Neural network unified format; Standardization; TensorFlow

Abstract

Deep learning frameworks based on neural network models have recently attracted considerable attention for their potential in a wide range of applications. Accordingly, recent developments in deep learning configuration platforms have led to renewed interest in a neural network unified format (NNUF) for standardized deep learning computation. Defining an NNUF is challenging because the dominant platforms change over time and the structures of deep learning computation models continue to evolve. This paper presents the design and implementation of a parser for an NNUF for standardized deep learning computation. We refer to the platform implemented with the neural network exchange format (NNEF) standard as the NNUF. The framework provides platform-independent processes for configuring and training deep learning neural networks, where the independence is provided by the NNUF model. The model allows all components of a neural network graph to be configured. The framework also allows the resulting graph to be shared easily with platform-dependent descriptions that configure various neural network architectures in their own ways. This paper presents the details of the parser design, its JavaCC-based implementation, and initial results.
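
Illustrative Parser Sketch

The abstract refers to a JavaCC-based parser for NNEF-style graph descriptions. As a rough illustration only, the sketch below shows a minimal JavaCC grammar for a simplified NNEF-like flat syntax. It is not the grammar from the paper: the parser class name (NnufParser), the token definitions, and the productions are assumptions introduced here for illustration.

// Hypothetical JavaCC grammar sketch (not the authors' grammar) for a
// simplified NNEF-like flat graph description such as:
//
//   graph net(input) -> (output)
//   {
//       input  = external(shape = [1, 3, 224, 224]);
//       filter = variable(shape = [32, 3, 5, 5], label = 'conv1/filter');
//       conv1  = conv(input, filter);
//       output = relu(conv1);
//   }

options { STATIC = false; }

PARSER_BEGIN(NnufParser)
import java.io.FileInputStream;

public class NnufParser {
    public static void main(String[] args) throws Exception {
        // Parse the file named on the command line and report success or failure.
        NnufParser parser = new NnufParser(new FileInputStream(args[0]));
        parser.Graph();
        System.out.println("Graph description parsed successfully.");
    }
}
PARSER_END(NnufParser)

SKIP : { " " | "\t" | "\r" | "\n" }

TOKEN : {
    < GRAPH  : "graph" >
  | < ARROW  : "->" >
  | < IDENT  : ["a"-"z","A"-"Z","_"] (["a"-"z","A"-"Z","0"-"9","_"])* >
  | < NUMBER : (["0"-"9"])+ ("." (["0"-"9"])+)? >
  | < STRING : "'" (~["'","\n"])* "'" >
}

// graph <name> ( inputs ) -> ( outputs ) { assignments }
void Graph()        : {} { <GRAPH> <IDENT> "(" IdentList() ")" <ARROW> "(" IdentList() ")" Body() }
void IdentList()    : {} { <IDENT> ( "," <IDENT> )* }
void Body()         : {} { "{" ( Assignment() )* "}" }

// result = operation(arg, name = value, ...);
void Assignment()   : {} { <IDENT> "=" Invocation() ";" }
void Invocation()   : {} { <IDENT> "(" [ ArgumentList() ] ")" }
void ArgumentList() : {} { Argument() ( "," Argument() )* }
void Argument()     : {} { [ LOOKAHEAD(2) <IDENT> "=" ] Value() }
void Value()        : {} { <IDENT> | <NUMBER> | <STRING> | Array() }
void Array()        : {} { "[" [ Value() ( "," Value() )* ] "]" }

Running javacc on such a grammar file generates the NnufParser class and its supporting Java sources; compiling them with javac yields a small syntax checker that accepts graph descriptions like the one shown in the leading comment.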

Publication Date

6-15-2018

Publication Title

EMDL 2018 - Proceedings of the 2018 International Workshop on Embedded and Mobile Deep Learning

Number of Pages

37-41

Document Type

Article; Proceedings Paper

Personal Identifier

scopus

DOI Link

https://doi.org/10.1145/3212725.3212727

Scopus ID

85058212919 (Scopus)

Source API URL

https://api.elsevier.com/content/abstract/scopus_id/85058212919
