SENSOR FUSION: THE CONVOLUTED SPECTER


AUTONOMOUS Vehicles have been on the technological development curve for the last decade and a half, but their commercialisation is nowhere on the horizon. Every ‘Eureka’ moment brings exponentially growing dynamic uncertainties to contend with. The militaries, meanwhile, have embarked on a scary journey of their own, developing Autonomous Weapons as a natural progression in the canvas of war fighting, with Deep Logic Algorithms to spin their web.

Typical PCB housing Deep Logic Gates and Embedded Algorithms

With the availability of greater bandwidths to play with and 5G networks in the offing, transmission and receiving speeds will undergo further change and the operating paradigms will favour “real time”. In fact, there would be ‘Autonomous Reactions’ using Artificial Intelligence (AI) and Machine Learning (ML). The possible applications open floodgates for a fertile mind: harnessing the power of parallel computing at the ‘EDGE’, without having to send information back and forth to/from backend servers for processing, logic intricately curated on microchips churns out options in real time.
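To make the edge-computing idea concrete, here is a minimal Python sketch of a sense–infer–act loop that runs entirely on the device; the sensor read, model and actuation hooks are hypothetical placeholders, not any particular product’s API:

```python
import time

def read_sensor():
    """Hypothetical sensor read; returns one raw measurement."""
    return 42.0  # placeholder value

def local_model(x):
    """Stand-in for a model deployed on the edge device itself."""
    return "ALERT" if x > 50.0 else "OK"

def act(decision):
    """Placeholder actuation/logging step."""
    print(decision)

# The whole sense -> infer -> act cycle stays on-device; nothing is
# shipped to a backend server, which is what keeps the loop 'real time'.
for _ in range(100):          # bounded loop for the sketch
    decision = local_model(read_sensor())
    act(decision)
    time.sleep(0.01)          # ~100 Hz cycle
```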


The Chilling Reality

For anyone wearing the cloak and hat of a developer (without a dagger in hand) and appreciating this fast-evolving canvas, the scenario is chilling: Autonomous Weapons appearing on the battlefield, able to decide to engage a threat manifesting at close quarters or at stand-off range, taking the decision to kill humans without human intervention. Add to this the inability to distinguish between a civilian and a soldier, or a friend and a foe, all adding to the complexity; not to speak of a machine going rogue, amply exhibited in movies like ‘Ex Machina’, a 2015 sci-fi film that follows a programmer instructed by his boss to give the “Turing Test” (a test that compares human intelligence to a computer’s) to a human-like robot.

The interesting part is how Ava, the protagonist robot, gains her knowledge. Ava’s intelligence comes from “Blue-Book”, a fictional search engine like Google. She can collect information from what people share online and build her behavioural patterns accordingly. This has evolved into what is loosely referred to as ‘Deep Learning’ by machines. Google recently released “Dataset Search”, a free tool for searching 25 million publicly available datasets, relying on dataset publishers using open standards to describe their datasets’ metadata. With this treasure trove of data to harness, scavenging software algorithms can get meaty chunks of training data to perfect their processes and responses.

The Sensor Matrix

As per the US-based National Science Foundation, the Internet of Things (IoT) is on track to connect 50 billion “smart” things by 2020 (the ongoing pandemic having slowed the graphs) and a trillion soon after. Developed initially as safety systems on industrial platforms, to warn operators of ensuing failures in components, sensors have come a long way since. For real-time decision-making, information collected through various sensors is run through complex algorithms for analytics and functioning, and the efficiency of the analytics depends on the quantum of datasets gathered. Today, this gathering of data for interpretation is done by ‘smart fusion’ of sensors that are weather- and temperature-agnostic, fusing the positive attributes of some while negating the weaknesses of others, and passing the results through Predictive Algorithms to enhance detection with high efficiency for multiple utilities like Situational Awareness, Early Warning, Enterprise Security, Surveillance and Predictive Analytics, amongst the scope of applications.
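As an illustration of ‘fusing the positive attributes of some while negating those of others’, the sketch below uses inverse-variance weighting, one of the simplest fusion rules: the sensor that is noisier in the prevailing conditions automatically contributes less. The numbers and the camera/radar pairing are assumptions for the example:

```python
import numpy as np

def fuse(estimates, variances):
    """
    Inverse-variance weighted fusion: a sensor that is noisier in the
    current conditions (larger variance) contributes less to the result.
    """
    w = 1.0 / np.asarray(variances, dtype=float)
    fused = np.sum(w * np.asarray(estimates, dtype=float)) / np.sum(w)
    fused_var = 1.0 / np.sum(w)
    return fused, fused_var

# Hypothetical example: a camera's range estimate degrades in fog
# (variance inflated), while the radar is weather-agnostic.
camera = (103.0, 25.0)   # (range in m, variance in m^2) in fog
radar  = (98.0, 4.0)     # radar holds its accuracy
rng, var = fuse([camera[0], radar[0]], [camera[1], radar[1]])
print(f"fused range: {rng:.1f} m, variance: {var:.2f} m^2")  # leans on radar
```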


These versatile platforms factor in the attributes of sensors to perform analogous tasks, often for real-time feedback. Sensor fusion over one entity is the simpler scenario to conjure. Imagine a host of vectors using multiple sensors, communicating gigantic data sets over allocated bandwidths to converge at Command and Control (C2) Centers, which must then sift this data to separate garbage from actionable information, convert it into intelligence and make it available in near real time for suitable action at the execution end; and before that is done, the same cycle repeats with the ever-evolving dynamic situation. With 5G in the offing and IoT users vying for band-space for their vectors, the specter indeed appears convoluted.

The US Defence Advanced Research Projects Agency (DARPA) periodically advertises DARPA Challenges to identify suitable partners from among academia, researchers and start-ups to solve vexations promulgated as ‘Problem Statements’ every year. In 2011, DARPA rolled out a programme called the ‘Mathematics of Sensing, Exploitation and Execution’. A number of universities and companies responded to the challenge and, after grants from DARPA, worked in earnest on a mathematical language to weed through sensor data, communicate it to the C2 hubs and react suitably to emerging threats. It may be appreciated that, post the Network Centric Warfare (NCW) postulations, simultaneous data received from drones, radars, UAVs and cell phones generating feeds from IR, video and ranging sensors demands alacrity from the systems at play, with tempered output responses. The System-of-Systems such an architecture demands is convoluted, intense, agile and highly reliable. Processors that must throughput and translate all this information into a common language call for complex logic algorithms using Neural Networks, Quantum Computing, Parallel Processing and self-generating code powered by AI and ML.
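A toy illustration of such a ‘common language’ is sketched below: heterogeneous sensor messages are normalised into one track schema a C2 hub could reason over. All field names and thresholds are invented for the example; real feeds have far richer schemas:

```python
from dataclasses import dataclass

@dataclass
class Track:
    """A hypothetical 'common language' record a C2 hub could consume,
    regardless of which sensor produced the detection."""
    source: str        # e.g. 'radar', 'uav_video', 'ir'
    timestamp: float   # seconds, on a common epoch
    lat: float
    lon: float
    confidence: float  # 0..1

def from_radar(msg: dict) -> Track:
    # Field names are illustrative; every real feed has its own schema.
    return Track("radar", msg["t"], msg["lat"], msg["lon"], msg["quality"])

def from_uav(msg: dict) -> Track:
    return Track("uav_video", msg["time_s"], msg["position"][0],
                 msg["position"][1], msg["det_score"])

# Heterogeneous feeds collapse into one stream the C2 layer can reason over.
feeds = [from_radar({"t": 1.0, "lat": 28.60, "lon": 77.20, "quality": 0.9}),
         from_uav({"time_s": 1.2, "position": (28.61, 77.21), "det_score": 0.7})]
actionable = [trk for trk in feeds if trk.confidence > 0.8]  # weed out 'garbage'
print(actionable)
```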

Data points generated on point cloud using Neural Networks

Cut to the present day, where the ongoing COVID-19 pandemic has had large corporates re-align their products to assist in the fight and remain afloat. Some governments are employing ubiquitous sensors and complex algorithms to carry out 24×7 surveillance through advanced facial recognition software, with security concerns on privacy looming large. There is close monitoring of smartphones, sensors to check body temperature, and databases to correlate movements and interactions through applications like the Indian COVID-19 tracking mobile application ‘Aarogya Setu’, developed by the National Informatics Centre to connect essential health services with the people in an endeavour for a combined fight against the pandemic.


Manoj Joshi, an Indian journalist and author and a Distinguished Fellow at the Observer Research Foundation, a New Delhi-based think tank, looked at the pandemic through a strategic lens in an aptly articulated article titled “Thucydides Trap”, in the editorial pages of the Times of India a while ago, running it through the theory of the Thucydides Trap, made famous by “Destined for War”, a book by Graham Tillett Allison Jr, an American political scientist and professor at Harvard. As per Allison, the theory espouses that ‘when one great power threatens to displace another, war is almost always the result’. The obvious reference here is to the US and China. What follows is that these powers would imbalance other regional powers in their race for supremacy. International Relations theorists will do well to factor collaterals into the Thucydides Trap paradigm, given China’s threatening posture in the South China Sea against the littorals. The pandemic has only worsened the situation between the two competitors. A combative approach rather than a conciliatory one will only confirm Allison’s hypothesis. For the moment, back to reconciliation between the sensors at least!

3D Sensors

As against utility in the two-dimensional space of length and width, the X-axis and the Y-axis, three-dimensional space gives you the perception of depth, the Z-axis. This perception replicates scenes as the human eye is accustomed to seeing them, provided the data generated by the sensors is processed on ‘appropriate platforms’ and packaged for the user in the format the human eye perceives. This ‘platform’ assumes many dimensions and is directly proportional to the amount of raw data a particular sensor, or group of different types of sensors, renders. It also depends on the medium relaying this data back to the processor; the quality of the algorithms and logic written by embedded backend software engineers to interpret this machine data using AI and ML; the processing speeds; and the semiconductor material used, thence generating a human-view-like replication of what the sensors are seeing. The processor, software and semiconductors are packaged on a board like the one in the photograph at the beginning of this article and run through an appropriate processing platform. For big data, multiple computing architectures are brought to bear: Graphics Processing Units (GPUs), parallel computing and dedicated Deep Learning hardware are packaged to deliver the requisite perception. This simplistic operating principle of processing 3D data is scaled up exponentially depending on the application (use-case).
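For a flavour of how the Z-axis is recovered in practice, the sketch below back-projects a pixel plus its measured depth into a 3D point using the standard pinhole camera model; the intrinsic parameters are illustrative values, not those of any specific sensor:

```python
import numpy as np

def backproject(u, v, depth, fx, fy, cx, cy):
    """
    Recover a 3D point (X, Y, Z) from a pixel (u, v) and its measured
    depth, via the pinhole camera model. fx, fy are focal lengths in
    pixels; (cx, cy) is the principal point.
    """
    z = depth                 # the Z-axis: perception of depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

# Example: a pixel near the image centre with 4.0 m of measured depth
# (all intrinsics here are assumed, typical VGA-depth-camera values).
point = backproject(u=320, v=240, depth=4.0,
                    fx=525.0, fy=525.0, cx=319.5, cy=239.5)
print(point)  # -> one point in the camera's 3D frame
```

Scaling this operation across every pixel of every frame, in parallel, is exactly the workload that GPUs and Deep Learning hardware are packaged to absorb.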


3D sensors are also being used extensively in medical and industrial applications. mmWave radars that penetrate walls have been developed, and sensors like Light Detection and Ranging (LiDAR) are being exploited for their potential. LiDAR is a remote sensing device that measures distance by illuminating a target with laser pulses and mapping the reflected light onto a point cloud. Used extensively, it generates possibilities for detection, monitoring and surveillance, amongst others.
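A minimal sketch of the two steps just described, assuming an idealised time-of-flight sensor: range follows from half the round-trip time of the laser pulse at the speed of light, and each (range, azimuth, elevation) return maps to one Cartesian point in the cloud:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_range(round_trip_s: float) -> float:
    """Range from a time-of-flight echo: the pulse travels out and back,
    so halve the round-trip distance."""
    return C * round_trip_s / 2.0

def to_point(r: float, azimuth: float, elevation: float):
    """Map a (range, azimuth, elevation) return to an (x, y, z) point,
    i.e. one entry in the point cloud. Angles in radians."""
    x = r * math.cos(elevation) * math.cos(azimuth)
    y = r * math.cos(elevation) * math.sin(azimuth)
    z = r * math.sin(elevation)
    return (x, y, z)

# A 667 ns round trip corresponds to roughly 100 m of range.
r = tof_range(667e-9)
print(r, to_point(r, math.radians(30), math.radians(2)))
```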

The Advanced Driver Assistance System (ADAS) uses inputs from sensors deployed on the vehicle to present the driver a perception layer of the surroundings, helping avoid accidents caused by the sudden movement of a pedestrian or other road user into the vehicle’s path. Sensors placed inside the driver cabin also monitor driver motion: when the driver tends to get drowsy and sleepy while driving, they record the change in posture or head movement and trigger warnings, either through a haptic movement of the steering wheel or an acoustic beep of the horn, alerting the co-driver and saving the vehicle and crew from an accident.
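The sketch below illustrates one plausible shape of such drowsiness logic: flag the driver when the head-pitch angle reported by an in-cabin pose sensor stays below a nod threshold for most of a short window. The thresholds, window size and pose source are assumptions for illustration, not any production ADAS algorithm:

```python
from collections import deque

class DrowsinessMonitor:
    """Flag drowsiness when the head-pitch angle stays below a nod
    threshold for most of a sliding window of samples."""
    def __init__(self, window=30, nod_deg=-15.0, ratio=0.8):
        self.samples = deque(maxlen=window)
        self.nod_deg = nod_deg   # pitch below this counts as a 'nod'
        self.ratio = ratio       # fraction of the window that must nod

    def update(self, head_pitch_deg: float) -> bool:
        self.samples.append(head_pitch_deg)
        if len(self.samples) < self.samples.maxlen:
            return False         # not enough history yet
        nodding = sum(1 for p in self.samples if p < self.nod_deg)
        return nodding / len(self.samples) >= self.ratio

monitor = DrowsinessMonitor()
for pitch in [-2.0] * 10 + [-20.0] * 30:   # head drops and stays down
    if monitor.update(pitch):
        print("warning: trigger haptic steering / acoustic beep")
        break
```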

Solution providers are sensitive to concerns of security, information breach and masquerading attacks. Technology now offers solutions to these intrigues: rather than having data converge at servers over the internet (the security of which is an attendant concern), EDGE processors carry out the requisite processing at the client end itself and build the perception there, ensuring a degree of security. The client remains the prime custodian of his data.

Horizon

In 1999, India was tipped to be a rising power at the rosy dawn of the 21st century. Undoubtedly, with its technological prowess harnessed in earnest, that rosy dawn shall long remain on the horizon and motivate the next generation to master the craft for diverse applications.

Sensor utility manifests in applications that provide solutions heretofore considered in the realms of fantasy. The sensor market has a multitude of variations to offer as per utility: Active sensors, Smart sensors, Intelligent sensors, Camera sensors, IR sensors, Laser sensors for detecting and engaging targets, Radar sensors for Synthetic Aperture Radars (SAR), Micro-Electro-Mechanical Systems (MEMS), Nano-sensors and Wearable sensors. LiDAR application is becoming widespread for its accuracy and scalability in the Defence, Security, Automotive, Aviation and Aerospace industries. Space-based LiDAR systems are now in the offing, their coherent detection principles giving better Signal-to-Noise Ratio (SNR) outputs. The Indian Cartosat-2 series of satellites are advanced remote sensing satellites used for cartographic applications, of immense value to the defence for studying minute details for specific operations.

The economic angle can hardly belie the sensor market, with 3D sensors on an anticipated growth trajectory of 26.54% CAGR up to 2023 and the MEMS share of $1628.57 projected to grow to 33% by the same year. This is another specter that beholds growth space for military and civil applications alike.
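For reference, the arithmetic behind a CAGR projection is simple compounding; a minimal sketch with an illustrative starting value:

```python
def project(value_now: float, cagr: float, years: int) -> float:
    """Compound annual growth: value * (1 + CAGR) ** years."""
    return value_now * (1.0 + cagr) ** years

# Illustrative only: a market of 100 (any currency unit) growing at the
# cited 26.54% CAGR roughly 2.56x's itself over four years.
print(project(100.0, 0.2654, 4))  # -> ~256.4
```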