1. How machine learning and artificial intelligence are driving cognitive behaviour in data and analytics
Artificial intelligence (AI) is emerging once again in our IT vernacular. Decades ago it was intended to capture the intelligence of knowledge workers and subject matter experts, codifying their skill sets into so-called “expert systems” that could execute repeatedly on demand. It didn’t work out so well: compute, storage and network resources were too slow and expensive. Moreover, knowledge changed too quickly, and codifying what remained valid required too much effort. But things are different now. Compute, storage and network resources (ubiquitous cloud solutions) are cheap and fast, and getting more so all the time.
The approach to AI has changed as well. Data volumes continue to grow exponentially, and humankind now converts data into knowledge at unprecedented rates, making it a constant challenge to codify such expertise and intelligence in software.
In this session we will explore how our systems can learn, and how those that do can be put to practical use.
2. Digital disruption and its impact on global corporations
Processes built for yesterday were designed to address a very specific set of circumstances – ones that no longer exist or have lost relevance – leaving businesses playing catch-up with the new digital economies of consumption. In order to succeed in the ever-evolving digital world, companies are renewing their legacy landscapes while simultaneously creating new technologies and processes.
It’s clear that we are witnessing several transformative trends that are intersecting to create “combinatorial innovation,” including mobile, sensor technology (smart, connected devices), big data, on-demand cloud computing, machine learning and predictive analytics.
In this session we will explore Platform Thinking – The New Normal, which provides a perspective on how some of the fastest-growing companies (Alibaba, Airbus, Apple, Amazon, Facebook, Google, and Uber) are redefining transactional relationships – where producers are consumers, and consumers are producers. These companies have re-imagined technologies to enable brands and their customers to interact and create shared value in ways not previously thought possible.
3. R Programming Hands-on sessions
R is an open source programming language and software environment for statistical computing and graphics that is supported by the R Foundation for Statistical Computing.
The R language is widely used among statisticians and data miners for developing statistical software and data analysis.
4. Hadoop and Cassandra Hands-on sessions
The Apache Hadoop software library is a framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models. It is designed to scale up from single servers to thousands of machines, each offering local computation and storage.
Note: Prior exposure to Core Java is desirable.
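To give a feel for the "simple programming models" Hadoop builds on, the sketch below implements the MapReduce idea – a map phase that emits (word, 1) pairs and a reduce phase that sums counts per word – in plain Core Java. This is an illustrative local sketch of the model only, not the actual Hadoop `Mapper`/`Reducer` API; in a real cluster, Hadoop distributes both phases across many machines.

```java
import java.util.Arrays;
import java.util.Map;
import java.util.stream.Collectors;

public class WordCountSketch {
    // "Map" phase: split each input line into lowercase words;
    // "Reduce" phase: group equal words and sum their counts.
    // Hadoop runs these two phases in parallel across a cluster;
    // here both run locally in a single JVM.
    public static Map<String, Long> wordCount(String[] lines) {
        return Arrays.stream(lines)
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+")))
                .filter(word -> !word.isEmpty())
                .collect(Collectors.groupingBy(word -> word, Collectors.counting()));
    }

    public static void main(String[] args) {
        String[] lines = {"big data big clusters", "data everywhere"};
        Map<String, Long> counts = wordCount(lines);
        System.out.println(counts.get("big"));  // 2
        System.out.println(counts.get("data")); // 2
    }
}
```

The hands-on session covers how the same map and reduce functions are expressed against Hadoop's own API and scheduled across nodes.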
Apache Cassandra is a free and open-source distributed database management system designed to handle large amounts of data across many commodity servers, providing high availability with no single point of failure.
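Cassandra avoids a single point of failure by placing every node on a hash ring: a row's partition key is hashed to a token, and the data lives on the node(s) owning that token range, so any node can serve requests. The sketch below illustrates that ring-placement idea in plain Java with hypothetical node names; it is a conceptual sketch only, not Cassandra's actual partitioner or Murmur3 token scheme.

```java
import java.util.List;
import java.util.SortedMap;
import java.util.TreeMap;

public class RingSketch {
    // Nodes are placed on a ring keyed by a non-negative hash.
    private final SortedMap<Integer, String> ring = new TreeMap<>();

    public RingSketch(List<String> nodes) {
        for (String node : nodes) {
            ring.put(node.hashCode() & Integer.MAX_VALUE, node);
        }
    }

    // A partition key is hashed to a token, and the row is owned
    // by the first node at or after that token on the ring,
    // wrapping around to the start if necessary.
    public String nodeFor(String partitionKey) {
        int token = partitionKey.hashCode() & Integer.MAX_VALUE;
        SortedMap<Integer, String> tail = ring.tailMap(token);
        return tail.isEmpty() ? ring.get(ring.firstKey()) : tail.get(tail.firstKey());
    }

    public static void main(String[] args) {
        RingSketch ring = new RingSketch(List.of("node-a", "node-b", "node-c"));
        System.out.println(ring.nodeFor("user-42"));
    }
}
```

In Cassandra itself, each token range is additionally replicated to the next N nodes on the ring, which is what provides high availability when individual commodity servers fail.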