IBM Watson: Application of Cognitive Computing in Life Sciences Research

Life sciences researchers are under pressure to innovate faster than ever, and big data offers the promise of unlocking unique insights. Today, data matters in every field, from internet traffic to customer records. Shopping malls, cab drivers, home-delivery services, and fast-food chains all collect feedback from their customers because they want data to expand their businesses and build a lasting customer database. Yet although more data is available than ever, only a portion of it is actually processed, understood, and analyzed.

New technologies like cognitive computing offer promise for addressing this challenge, because cognitive solutions are specifically designed to integrate and analyze big data sets. They are built to understand varied datasets, including structured databases, and to handle technical, industry-specific content, advanced reasoning, and predictive modeling.

Watson, a cognitive computing technology, has been configured to support life science research. This edition of Watson encompasses the medical literature, patents, and chemical and pharmaceutical data that researchers commonly use in their work.


Many people know Watson as the IBM-developed cognitive supercomputer that won the game show Jeopardy! in 2011. In truth, Watson is not a single computer but a set of algorithms and APIs, which IBM has applied in industries from finance to healthcare.
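To make the "set of APIs" point concrete, here is a minimal sketch of how a client might address one Watson service, Natural Language Understanding, over its REST interface. The endpoint URL, version date, and request fields follow the service's general documented shape but should be treated as assumptions; real use requires IBM Cloud credentials, so the request is built here without being sent.

```python
import json
import urllib.request

# Illustrative only: Watson is exposed as cloud APIs rather than a single
# machine. The URL and version parameter below are placeholders/assumptions.
API_URL = ("https://api.us-south.natural-language-understanding"
           ".watson.cloud.ibm.com/v1/analyze?version=2022-04-07")

# JSON body: the text to analyze and which "features" to extract from it.
payload = {
    "text": "Watson surfaced three candidate drug targets in the literature.",
    "features": {"keywords": {"limit": 3}},
}

# Build (but do not send) the POST request.
request = urllib.request.Request(
    API_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# With valid credentials, urllib.request.urlopen(request) would dispatch it.
print(request.get_method(), request.get_header("Content-type"))
```

The point is architectural: clients talk to Watson the way they talk to any web API, which is what lets IBM embed the same capabilities across many industries.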

Recently, IBM has announced several new partnerships that take this work further, applying Watson's cognitive capabilities to a new set of problems around the world.

Steve Gold, Vice President of IBM Watson, describes cognitive computing as the latest stage in a progression. What began with his own company's tabulating machines, built to process US census data at the start of the 20th century, evolved into programmatic computing by mid-century with the arrival of transistors, relational databases, magnetic storage, and microprocessors. The tremendous growth of unstructured data in recent years, and the artificial-intelligence methods developed to help us make sense of it and learn from it, have given rise to cognitive computing. These cognitive computers don't need to be programmed; they can learn on their own.

The concept works like this: traditionally, computers have done only what we tell them to do. We give them code containing instructions for whatever we want done, and because they are programmed to follow commands, they carry those instructions out exactly as written. The limitation is that they can do only what we program and teach them to do; we cannot simply order a computer to develop a cure for AIDS and expect a result. But computers can process data and information far faster than any human, so it is far more efficient to give them all the available data and let them work out the best solution from it.
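The contrast above can be sketched in a few lines. This is a toy example, not anything from Watson itself: a programmatic function applies a hand-coded rule, while a "learning" routine recovers the same rule purely from example data, using ordinary least squares. All names and data here are illustrative.

```python
def programmatic_fahrenheit(celsius):
    # Programmatic computing: the rule (9/5 * C + 32) is hand-coded.
    return 9 / 5 * celsius + 32

def learn_linear_rule(xs, ys):
    # "Learning": infer slope and intercept from observed (x, y) pairs
    # via ordinary least squares, instead of being told the formula.
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Show the learner only observed (celsius, fahrenheit) pairs:
celsius = [0, 10, 20, 30, 40]
fahrenheit = [programmatic_fahrenheit(c) for c in celsius]
slope, intercept = learn_linear_rule(celsius, fahrenheit)
print(round(slope, 2), round(intercept, 2))  # recovers slope 1.8, intercept 32.0
```

Cognitive systems work at a vastly larger scale and on unstructured data, but the shift is the same: instead of encoding the answer, we hand the machine the data and let it find the pattern.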