POV - Artificial Intelligence & Machine Learning

By Mahendra K Upadhyay, Head of Data & Technology, Mindshare India

We are living in times where computing is no longer just computing but cloud/edge computing, and data is Big Data; established schools of thought on data mining and analytics have been redefined. Artificial Intelligence (AI) and Machine Learning (ML) are the two terms now used prominently for analytics use cases.

People often use these two terms interchangeably, but AI is the universe of the use-case ecosystem, whereas ML is a set of algorithmic components within that framework. Machine learning has been in use for quite a long time, predominantly in manufacturing and home appliances, e.g. train and car assembly lines, air conditioners, washing machines, etc.

ML is a method where you train the machine (the algorithm itself) with continuous data over a period of time, and the algorithm slowly starts correcting/improving its output based on each outcome. Most of the algorithms are the same ones we used in the past: decision trees, random forests, KNN, regression methods, neural networks, etc.
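This "learning by correction" can be sketched in a few lines. The following is a minimal, illustrative example (plain Python, not tied to any specific product) of a one-variable linear model that nudges its parameters a little after every observed outcome:

```python
# Minimal sketch: a linear model y ~ w*x + b that "learns" by
# repeatedly correcting itself against the training data.
def train(points, lr=0.05, epochs=1000):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in points:            # each observed outcome...
            error = (w * x + b) - y    # ...is compared to the prediction,
            w -= lr * error * x        # and the model nudges its parameters
            b -= lr * error            # to reduce the error next time
    return w, b

# Toy data that follows y = 2x + 1
data = [(0, 1), (1, 3), (2, 5), (3, 7)]
w, b = train(data)
print(w, b)  # converges close to 2.0 and 1.0
```

The same feedback loop, scaled up to many variables and far richer models, is what the decision trees, regressions, and neural networks mentioned above perform under the hood.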

The Beginning:

The 1990s were all about Y2K and traditional process automation using "operational systems" such as billing, CRM, finance, or supply management systems. This is where people learnt how to use computers and applications.

The new century was all about use cases around the Internet and public platforms for information exchange, dominated by players like Yahoo!, Google, and Facebook. This is where we learnt how to generate information.

But then the arrival of Apple and Amazon changed the way industries saw use cases around data and contextual communication, not only with people but with machines. This was when Big Data, AI, and ML algorithms started gaining momentum and were being discussed on almost every industry forum, for two sheer reasons:

a. Everyone was connected to the internet and available for communication and ideas
b. Everyone was demanding something at that very moment

Earlier analytical use cases were limited to predicting consumer behaviour in the near future, e.g. credit scores, propensity to buy, propensity to leave, sales/demand forecasting, etc.; none of them involved real-time context exchange or requirements.

Information exchange combined with cloud computing gave birth to a new kind of analytics processing: one that learns on its own and determines answers/scores in near real time.

Algorithms are trained with the following methods:

a. Supervised learning – the training data set contains evidence of the outcome (labels)
b. Unsupervised learning – outcomes are open-ended and can themselves become further input data sets
c. Semi-supervised – start supervised and later go unsupervised, taking the best of both approaches
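The difference between the first two methods can be shown with toy code. The sketch below is purely illustrative (plain Python; a real project would use a library such as scikit-learn): the supervised part copies labels it was given, while the unsupervised part groups raw values with no labels at all, using a simple one-dimensional k-means.

```python
# Supervised: labels are given, and the model learns the mapping.
labelled = [(1.0, "low"), (1.2, "low"), (8.9, "high"), (9.3, "high")]

def predict(x):
    # 1-nearest-neighbour: reuse the label of the closest training point
    return min(labelled, key=lambda p: abs(p[0] - x))[1]

print(predict(1.5))   # "low"
print(predict(9.0))   # "high"

# Unsupervised: no labels; the algorithm groups the raw values itself.
def kmeans_1d(values, c1, c2, steps=10):
    for _ in range(steps):
        g1 = [v for v in values if abs(v - c1) <= abs(v - c2)]
        g2 = [v for v in values if abs(v - c1) > abs(v - c2)]
        c1 = sum(g1) / len(g1) if g1 else c1   # move each centre to the
        c2 = sum(g2) / len(g2) if g2 else c2   # mean of its current group
    return c1, c2

centres = kmeans_1d([1.0, 1.2, 8.9, 9.3], 0.0, 5.0)
print(sorted(centres))   # one centre near 1.1, one near 9.1
```

Note that the unsupervised clusters it discovers line up with the "low"/"high" labels of the supervised case, which is exactly why method (c) above works: unsupervised groupings can feed a later supervised stage.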

Data is the soul of this entire ecosystem: an algorithm can predict only as well as its data inputs allow. Everything starts with data, and the output is nothing but another data point.

The diagram below depicts how an AI/ML framework is used for different outcome use cases in industry on the AWS cloud environment.

Some of the key use cases that came out of the AI/ML evolution, and which are changing our world, fall in the following areas (each is large enough to merit a separate discussion):

- Customer Care & Marketing personalization
- Financial Trading & Payments
- Transportation & Optimization
- Healthcare and Medicines
- Manufacturing and assembly lines

Next Wave of Automation:

With more and more data available for analysis, and machines more connected over faster networks, new use cases arrived that had not been thought of before. This gave birth to a new kind of ecosystem commonly known as IoT (Internet of Things): connected devices, connected cars, connected homes, and hence connected minds.

It is estimated that there will be more than 50bn connected devices by 2022, and humans may not be initiating every intent; rather, these devices will be taking actions and generating calls to action on behalf of humans.

Natural language processing (NLP), voice-enabled ecosystems (no more keyboard typing), and sensor data processing are taking the consumer experience to the next level; Alexa and Siri will soon be known names in every household. Keeping these in mind, the following three areas are the defining blocks of any future ML/AI initiative:

• Contextual consumer experience: interactions that understand the consumer's context in real time, through NLP, voice, and sensor data, rather than waiting for an explicit keyboard-driven request

• Machine assistance: connected devices initiating actions and generating calls to action on behalf of humans, instead of humans initiating every intent themselves

• Fewer human errors: self-learning algorithms that correct and improve their output with each outcome, reducing the manual errors inherent in traditional process automation

Summary:

After "data", AI/ML are the most abused words in today's technology discussions; everyone talks about these areas, but few have really done it. To start the AI/ML journey, the first key is to have a proper data strategy and to work out how it links to the overall business and consumer digital landscape.

The second step is to identify the right use case and understand its impact internally and externally. Once these two are put together properly, the next step is to identify the right partner who can make it happen for you.
