This piece continues the previous blog on OpenCog and AGI. Here we will examine a core element of the OpenCog system: the AtomSpace, the information storage and retrieval system used by the framework. It is based on the concept of hypergraphs, something we will look into and develop an understanding of.

Let us first lay down a foundation of graph database concepts, since these will be used to define the higher-level abstractions behind hypergraphs.

A graph database is a database consisting of a set of objects, which can…
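To make the distinction concrete before we get to the AtomSpace, here is a minimal sketch (plain Python, illustrative names only, not OpenCog's API) of the difference between an ordinary graph edge, which joins exactly two vertices, and a hyperedge, which may join any number:

```python
# An ordinary graph edge connects exactly two vertices.
graph_edges = [("dog", "mammal"), ("mammal", "animal")]

# A hyperedge can connect any number of vertices at once.
hyper_edges = [
    {"dog", "mammal"},          # an ordinary binary edge
    {"dog", "cat", "mammal"},   # one hyperedge linking three concepts
]

def incident(vertex, edges):
    """Return every edge that contains the given vertex."""
    return [e for e in edges if vertex in e]

print(incident("dog", hyper_edges))
```

The point of the hypergraph generalisation is exactly this: a single edge can express an n-ary relationship that a plain graph would need several binary edges to encode.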

While we may think AI is about using networks to solve worldly problems, extrapolating networks of networks to solve every problem in the world is an endless sheet that humans will keep stitching, since there is no upper bound on intelligence, a point thoroughly explained in François Chollet's paper "On the Measure of Intelligence". I believe that while neural networks solve some of the tasks in our daily lives well using computing capabilities, there is more than just matrix multiplication in our cognitive system, which makes us think different…

Well, I would not be writing this article if evolution were not a phenomenon that we as a human civilisation have experienced and found in nature. I believe the most sophisticated algorithm on the planet is evolution, which has no defined objective yet keeps progressing, creating novel creatures and bringing novel changes to ecosystems. This can be related to Kenneth Stanley's work on open-endedness, such as the POET algorithm.

On the same note, neuroevolution has been theorised too: how the internal model of the brain evolves with environment interactions and learns to…

Continuing with theories of the brain's neural mechanics, another star in the universe twinkles quite brightly: cell assembly and correlation theory, proposed by Donald O. Hebb.

Synaptic plasticity basically says that if the axon of cell A is near enough to excite cell B and persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased. **"Fire together, wire together."**
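The rule above can be sketched in a few lines (a minimal, illustrative update rule, not a biophysical model): the weight between two units grows in proportion to the product of their activities.

```python
def hebbian_update(w, pre, post, lr=0.1):
    """One Hebbian step: strengthen the synapse in proportion to
    the coincidence of presynaptic and postsynaptic activity."""
    return w + lr * pre * post

w = 0.5
# Both cells active together: the connection strengthens.
w = hebbian_update(w, pre=1.0, post=1.0)
print(w)
```

Note that if either cell is silent (`pre` or `post` is zero), the weight does not change, which is exactly the "fire together" condition in Hebb's formulation.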

The cell assembly theory posits that groups of interconnected neurons are formed through a strengthening of…

Continuing in the paradigm of theories concerning the brain's neurological functioning (refer to the previous blogs on the free energy principle and the Bayesian brain hypothesis), one theory that speaks for a special case of the free energy principle is the **infomax principle.**

**Infomax Principle**

The principle maximises the mutual information between sensory states and the internal representations of those sensory states formed by the brain's internal model of the outer world, under a constraint on the efficiency of those internal representations.

So first, what is mutual information? To understand that, we first need to understand the meaning…
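As a concrete preview, here is a minimal sketch of the standard discrete definition, I(X;Y) = Σ p(x,y) log₂( p(x,y) / (p(x)p(y)) ), computed from empirical sample counts (illustrative code, not specific to the infomax literature):

```python
import math
from collections import Counter

def mutual_information(pairs):
    """I(X;Y) in bits for a list of (x, y) samples, using empirical probabilities."""
    n = len(pairs)
    pxy = Counter(pairs)                  # joint counts
    px = Counter(x for x, _ in pairs)     # marginal counts of X
    py = Counter(y for _, y in pairs)     # marginal counts of Y
    return sum(
        (c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
        for (x, y), c in pxy.items()
    )

# X and Y perfectly correlated: MI equals the entropy of X (1 bit here).
print(mutual_information([(0, 0), (1, 1), (0, 0), (1, 1)]))  # 1.0
```

Intuitively, this is what the infomax principle asks the internal representation to preserve about the sensory input: when the representation tells you a lot about the stimulus, this quantity is large.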

Continuing on to another fascinating theory in the field of neuroscience: the **Bayesian brain hypothesis.** As we are all aware, Bayesian statistics is the probabilistic methodology of updating beliefs with evidence, using conditional probability on your priors. It should be a no-brainer to expect a theory extrapolating the Bayesian belief system to our cognitive system. For those who need a background on Bayesian statistics, please refer to this link. So let's get into the Bayesian brain hypothesis and understand what exactly it has to offer us.

It formulates perception as a constructive process based on internal, or generative, models. It…
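The belief-updating machinery the hypothesis borrows is just Bayes' rule. Here is a minimal sketch for a binary hypothesis H (illustrative arithmetic, not a model of any neural circuit): posterior ∝ likelihood × prior.

```python
def bayes_update(prior_h, lik_e_given_h, lik_e_given_not_h):
    """Return P(H | E) from the prior P(H) and the two likelihoods P(E | H), P(E | not H)."""
    evidence = lik_e_given_h * prior_h + lik_e_given_not_h * (1 - prior_h)
    return lik_e_given_h * prior_h / evidence

# Prior belief 0.5; the observation is four times likelier under H,
# so the posterior shifts strongly toward H.
print(bayes_update(0.5, 0.8, 0.2))  # 0.8
```

In the Bayesian brain picture, perception is this computation run continuously: the generative model supplies the likelihoods, and each sensory observation updates the prior into a posterior that becomes the prior for the next moment.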

Since the boom of the era of artificial intelligence, I have come across a lot of literature on perceptron-based deep learning architectures claiming that these models learn human cognitive phenomena through mathematical approximation methods. I was not able to accept that these mathematical approximations, whose end result is pattern matching, are an alias of human cognitive systems. François Chollet's work on the Measure of Intelligence and the ARC challenge clearly states that we have digressed far from the path of creating intelligent systems. While if you…

Convolutional neural networks are a breed of neural-network-based representation learning architectures that use the convolution operation to downsample a large N-dimensional feature map while keeping the important information intact in a low-dimensional representation. I will not get into the details of what CNNs are and how they work: what the convolution operation is, or what max pooling, average pooling, and global pooling are. I expect the reader of this article to have a fair understanding of these concepts and, apart from that, a fair understanding of quantum arithmetic and quantum gates.
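As a quick refresher on the downsampling step mentioned above, here is 2×2 max pooling over a 4×4 feature map in plain Python (an illustrative sketch; real frameworks vectorise this): each output cell keeps the largest value in its 2×2 window, halving each spatial dimension.

```python
def max_pool_2x2(fmap):
    """2x2 max pooling with stride 2 over a 2-D feature map (list of lists)."""
    rows, cols = len(fmap), len(fmap[0])
    return [
        [max(fmap[i][j], fmap[i][j + 1], fmap[i + 1][j], fmap[i + 1][j + 1])
         for j in range(0, cols, 2)]
        for i in range(0, rows, 2)
    ]

fmap = [
    [1, 3, 2, 0],
    [4, 2, 1, 5],
    [0, 1, 3, 2],
    [2, 6, 1, 1],
]
print(max_pool_2x2(fmap))  # [[4, 5], [6, 3]]
```

Average pooling replaces `max` with the window mean, and global pooling collapses the whole map to a single value per channel; the principle of trading spatial resolution for compactness is the same.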

We will be using the cluster state as an…

In this second part, we will look into the fundamental details of hybrid quantum-classical machine learning.

Before you get into the details of this article, I would suggest brushing up on the fundamentals of quantum vector operations. You can refer to this awesome series on quantum computing to get started.

- In general, quantum neural networks can be expressed as a product of parameterized unitary matrices. Samples, i.e. data points or data distributions, and expectation values, i.e. …
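The ingredients named above can be sketched in a few lines of plain Python (an illustrative single-qubit example, not a real quantum framework): a state vector for |0⟩, one parameterized unitary RY(θ), and the expectation value of the Pauli-Z observable that a measurement would estimate.

```python
import math

def ry(theta):
    """The parameterized single-qubit rotation RY(theta) as a 2x2 real matrix."""
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return [[c, -s], [s, c]]

def apply(u, state):
    """Apply a 2x2 unitary to a 2-component state vector."""
    return [u[0][0] * state[0] + u[0][1] * state[1],
            u[1][0] * state[0] + u[1][1] * state[1]]

def expect_z(state):
    """<Z> = |amplitude of 0|^2 - |amplitude of 1|^2."""
    return abs(state[0]) ** 2 - abs(state[1]) ** 2

# Rotate |0> by a quarter turn: the state becomes an equal superposition,
# so the Z expectation value drops from 1 to 0.
state = apply(ry(math.pi / 2), [1.0, 0.0])
print(round(expect_z(state), 6))  # 0.0
```

A quantum neural network composes many such parameterized unitaries, and training adjusts the θ parameters so that the measured expectation values fit the data.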

Let's cut to the chase and get started with the quantum machine learning coding capability provided by TensorFlow. As we discussed in the previous blog, NISQ technology and the library named Cirq were developed to build quantum circuits on NISQ-based hardware. TFQ has essentially been built as an amalgamation of TensorFlow with Cirq to build hybrid classical-quantum models. We will get to this part shortly. This piece will explain how TFQ constructs work and how to code your own TFQ-based quantum machine learning model.

One key observation…

Homo Bayesian