Artificial Intelligence and Consciousness - Part 2

An artificial neural network (ANN), the primary building block of machine learning and deep learning, consists of neurons, i.e. nodes, grouped into an input layer, one or more hidden layers, and an output layer. The input layer receives the problem data, while the output layer produces a classification or probability (for example, detecting the correct image or recognizing faces or text). The hidden layers are the most interesting part: they progressively extract more and more abstract features as the signal moves toward the output.

This is done through two processes called forward propagation and backpropagation. Simply put, the layers are connected to one another and feed input to one another, with each connection carrying a weight and each neuron a bias. The weights are first chosen randomly, and the network is then trained on a dataset. More training data is generally better, as it helps reduce the problem of overfitting, especially when there are many features. At each neuron, the incoming signal is transformed by a mathematical function called an activation function.

During training by a process called supervised learning, the network's outputs are compared against the intended outputs (e.g. the true class labels in image classification) and a loss function is computed. The lower the loss, the better the network's performance. If the loss is high, the weights are adjusted through backpropagation until the loss is minimized by gradient descent: the derivatives, or gradients, of the loss with respect to the parameters indicate whether it is increasing or decreasing, and the parameters are moved accordingly. Even when the loss is minimized, the result may be only a local minimum, or a case of overfitting, so the algorithm has to be evaluated on held-out test data to see its true performance. A validation set is often used before the test data to check the training findings.
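The whole training loop described above can be sketched in miniature. The following is an illustrative example only, not taken from any particular library: a single sigmoid neuron trained by gradient descent on a toy AND-gate dataset. The dataset, learning rate, and starting weights are all assumptions chosen for the sketch.

```python
import math

def sigmoid(z):
    """Activation function: squashes any input into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# Toy supervised-learning data: inputs and their intended outputs (an AND gate).
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

# Weights would normally start random; fixed small values keep this reproducible.
w1, w2, b = 0.1, -0.2, 0.0
lr = 0.5  # learning rate (an illustrative choice)

for epoch in range(5000):
    for (x1, x2), y in data:
        # Forward propagation: weighted sum plus bias, then activation.
        y_hat = sigmoid(w1 * x1 + w2 * x2 + b)
        # Backpropagation in miniature: gradient of squared-error loss
        # with respect to the pre-activation, via the chain rule.
        grad = (y_hat - y) * y_hat * (1 - y_hat)
        # Gradient descent: nudge each parameter against its gradient.
        w1 -= lr * grad * x1
        w2 -= lr * grad * x2
        b -= lr * grad

# After training, the neuron fires (> 0.5) only for the input (1, 1).
print(round(sigmoid(w1 * 1 + w2 * 1 + b)))  # 1
print(round(sigmoid(w1 * 0 + w2 * 1 + b)))  # 0
```

A real network repeats this same idea across many neurons and layers, with the matrix algebra handling all the weights at once.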
This is the basic mechanism of every artificial neural network, or ANN, and of its advanced variants, such as deep networks (with more hidden layers) and convolutional neural networks (which alternate convolutional filters and sliding-window pooling operations like max pooling with fully connected layers to ensure feature reduction). The underlying mathematics involves complex matrix and vector operations (linear algebra) and the calculus of activation functions (sigmoid, tanh, ReLU, leaky ReLU, etc.).
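The activation functions named above are all simple scalar functions; a minimal sketch of each, using only the standard library (the 0.01 slope for leaky ReLU is a common convention, assumed here):

```python
import math

def sigmoid(z):
    # Smoothly maps any real number into (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def tanh(z):
    # Like sigmoid, but maps into (-1, 1) and is centered at zero.
    return math.tanh(z)

def relu(z):
    # Passes positive values through; clips negatives to zero.
    return max(0.0, z)

def leaky_relu(z, alpha=0.01):
    # Like ReLU, but lets a small fraction of negative values through,
    # which avoids neurons "dying" with a zero gradient.
    return z if z > 0 else alpha * z

print(relu(-2.0))        # 0.0
print(leaky_relu(-2.0))  # -0.02
```

Their derivatives are what backpropagation uses in the chain rule, which is why differentiability (or near-differentiability, in ReLU's case) matters.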
However, many pundits agree on one point: what happens inside the hidden layers is still not clear to anybody. This is indeed spooky. We have a whole gamut of feature engineering solving advanced problems like computer vision, self-driving cars, speech recognition, and text recognition, yet the underlying process is not clear. Are these networks really thinking like human brains? Are they becoming more and more intelligent on their own, regardless of their complexity? Deep networks can now handle millions of features, and programming them is becoming easier with sophisticated Python libraries and platforms like Keras. But what is it that drives intelligence? Is our human brain also a network that handles input matrices, with its electrical impulses performing some kind of advanced feature engineering to derive near-perfect output? This is something unexplained and unexplored. And how does all this relate to consciousness? Is an advanced deep network conscious? Or is consciousness a product of the deep and complex neural networks in human and other animal brains? Let us discuss that in another piece of writing. This is an interesting frontier where philosophy, science, and technology all converge and create a confluence of ideas.
