Exploring numerical calculations with CalcNet

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review



Neural networks are poor generalizers outside their training range: they are good at capturing bias but may miss the underlying concept. An important issue with neural networks is that when test data falls outside the training range, they fail to predict accurate results and thus lose the ability to generalize a concept. For systematic numerical exploration, neural accumulators (NAC) and the neural arithmetic logic unit (NALU) have been proposed, which perform excellently on simple arithmetic operations. However, a major limitation of these units is that they cannot handle complex mathematical operations and equations. For example, NALU can predict accurate results for the multiplication operation but not for the factorial function, which is essentially a composition of multiplication operations; it is unable to comprehend the pattern behind an expression when compositions of operations are involved. Hence, we propose a new neural network structure that takes in complex compositional mathematical expressions and produces the best possible results, using small NALU-based neural networks as pluggable modules that evaluate these expressions at the unitary level in a bottom-up manner. We call this network CalcNet, as it helps predict accurate calculations for complex numerical expressions, even for values outside the training range. As part of our study, we applied this network to numerically approximating complex equations and evaluating biquadratic equations, and we tested the reusability of its modules. We arrived at far better generalization on complex arithmetic extrapolation tasks compared to both purely NALU-layer-based neural networks and simple feed-forward neural networks. We also achieved even better results with our golden-ratio-based modified NAC and NALU structures on both interpolation and extrapolation tasks in all evaluation experiments.
Finally, from a reusability standpoint, the model demonstrates strong invariance when making predictions on different tasks.
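To make the building blocks concrete, the sketch below shows the forward pass of the NAC and NALU cells the abstract refers to, following their published formulation: NAC constrains its effective weights toward {-1, 0, 1} via an elementwise tanh–sigmoid product so it can represent exact addition/subtraction, and NALU gates between that additive path and a multiplicative path computed in log-space. This is an illustrative NumPy sketch with hand-set (hypothetical) parameters, not the authors' CalcNet implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def nac_forward(x, W_hat, M_hat):
    # NAC: effective weights tanh(W_hat) * sigmoid(M_hat) are pushed
    # toward {-1, 0, 1}, so the cell can learn exact add/subtract
    # operations that extrapolate outside the training range.
    W = np.tanh(W_hat) * sigmoid(M_hat)
    return x @ W

def nalu_forward(x, W_hat, M_hat, G, eps=1e-7):
    # NALU: a learned gate g mixes the additive NAC path with a
    # multiplicative path, which is a NAC applied in log-space.
    a = nac_forward(x, W_hat, M_hat)
    m = np.exp(nac_forward(np.log(np.abs(x) + eps), W_hat, M_hat))
    g = sigmoid(x @ G)
    return g * a + (1.0 - g) * m

# Hand-set (hypothetical) parameters driving both effective weights to ~1:
big = 10.0
W_hat = np.full((2, 1), big)   # tanh(10) ~ 1
M_hat = np.full((2, 1), big)   # sigmoid(10) ~ 1
x = np.array([[3.0, 4.0]])

print(nac_forward(x, W_hat, M_hat))                        # ~ 7  (3 + 4)
G = np.full((2, 1), -big)                                  # gate ~ 0 -> multiplicative path
print(nalu_forward(x, W_hat, M_hat, G))                    # ~ 12 (3 * 4)
```

Composing many such units, as CalcNet does with pluggable NALU-based modules evaluated bottom-up, is what the plain cells above cannot do on their own for expressions like factorials.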

Original language: English
Title of host publication: 2019 IEEE 31st International Conference on Tools with Artificial Intelligence (ICTAI)
Number of pages: 6
ISBN (Electronic): 978-1-7281-3798-8
Publication status: Published - Feb 2020
MoE publication type: A4 Article in a conference publication
Event: IEEE International Conference on Tools with Artificial Intelligence - Portland, United States
Duration: 4 Nov 2019 - 6 Nov 2019
Conference number: 31


Conference: IEEE International Conference on Tools with Artificial Intelligence
Abbreviated title: ICTAI
Country/Territory: United States


  • Neural networks
  • Neural Arithmetic Logic Unit
  • Neural Accumulators
  • CalcNet


