Even as AI and other fields of computer science continue to advance, there are still many tasks computers can't perform that human beings have little to no trouble with. Consider, for example, an image recognition reCAPTCHA test. Bots struggle with these tests because they see images as the pixels they're made of, not the shapes they contain. Humans can easily recognize these shapes (clouds, traffic cones, stop signs), but the majority of bots simply can't.
Object recognition is just one of many tasks that computers cannot yet perform as well as most people. While this means bots will continue to have trouble accessing websites, an inability to recognize patterns and understand context also limits how useful computers can be.
Promising solutions to this problem can be found in cognitive computing, a subfield of computer science that uses computer models to closely simulate human cognition through machine learning and narrative-based techniques. Although cognitive computing is still a fairly young concept, it's already seeing use in programs like MyIQ, a structured diagnostic system that tracks how users process information and manage behavioral friction.
Diagnostics as an adaptive framework
As a diagnostic system, MyIQ observes user inputs to determine how a given user thinks, where their focus wavers, and how their emotional state changes under stress. The program makes these observations through an adaptive IQ assessment that learns from and responds to how a user answers each question. These adaptations are later analyzed to identify patterns in user responses that can be interpreted over time.
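MyIQ's actual question-selection logic is not public, but the general adaptive pattern the paragraph describes can be sketched with a simple staircase rule: step difficulty up after a correct answer and down after an incorrect one, while recording the path for later analysis. The function names and difficulty scale below are illustrative assumptions, not the platform's API.

```python
# Minimal sketch of an adaptive assessment loop (staircase rule).
# All names and the 1-10 difficulty scale are hypothetical.

def next_difficulty(current: int, was_correct: bool,
                    lo: int = 1, hi: int = 10) -> int:
    """Step difficulty up on a correct answer, down on an incorrect one,
    clamped to the allowed range."""
    step = 1 if was_correct else -1
    return max(lo, min(hi, current + step))

def run_assessment(answers: list[bool], start: int = 5) -> list[int]:
    """Replay a sequence of answers and record the difficulty presented
    at each question, so the path can be analyzed afterward."""
    difficulty, path = start, []
    for was_correct in answers:
        path.append(difficulty)
        difficulty = next_difficulty(difficulty, was_correct)
    return path
```

The recorded path (e.g., `run_assessment([True, True, False])` yields `[5, 6, 7]`) is the kind of per-response trace that can later be mined for patterns in how a user adapts over time.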
The platform also features a personality inventory and a relationship diagnostic toolset, both of which use user inputs to produce descriptive behavioral data. MyIQ presents this information in a digestible manner, and users can then draw on it to identify potential areas of weakness in how effectively they pick up on certain patterns.
Finding clarity in ambiguity
Ambiguity remains one of the more difficult concepts for computers to handle, in large part because computers are rules-based by design. This strict adherence to a limited set of parameters makes it difficult for programs to navigate ambiguity in speech and images, a barrier that cognitive computing has begun to overcome through the use of neural networks, a type of AI structured like the neural pathways found in the human brain.
Like a human brain, these networks improve over time by learning from each piece of data they are fed. MyIQ follows a similar process as part of its cognitive assessments, making it possible for the system to learn from its users and adjust the questions it poses accordingly. Some of these questions even involve navigating ambiguous language, a task most programs struggle with.
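The learn-from-each-example behavior described above can be illustrated at toy scale with a single logistic neuron trained by online gradient descent: the weights shift a little after every data point, so predictions improve with exposure. Real systems use far larger networks; this sketch only demonstrates the underlying idea and is not MyIQ's implementation.

```python
# Toy single-neuron "network" that improves from each example it sees.
# Purely illustrative; not any specific product's model.
import math

def predict(w: list[float], b: float, x: list[float]) -> float:
    """Logistic activation over a weighted sum of inputs."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

def train(data, epochs: int = 200, lr: float = 0.5):
    """Online updates: weights are nudged after every single example,
    which is what lets the model improve with each piece of data."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            p = predict(w, b, x)
            err = p - y                      # gradient of log loss w.r.t. z
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b
```

Trained on a linearly separable pattern such as logical OR, the neuron's outputs move toward the correct labels after a few hundred passes, a small-scale version of the data-driven improvement the article attributes to neural networks.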
New means of computing cognitive data
Cognitive computing has simplified the process of acquiring meaningful cognitive data, i.e., information derived from how people think. This data doesn't come from measuring which boxes a user clicked when taking a test, but from how they reason about complex subjects like ambiguous language.
Programs like MyIQ make it possible to record previously incomputable information and use it to analyze abstract concepts like adaptability and behavioral change. While this technology is still actively being developed, its ability to learn from regular use makes it a potentially powerful tool for viable mental diagnostics.
VentureBeat newsroom and editorial staff were not involved in the creation of this content.
