Business insights, anytime, anywhere -- Just ask Alexa!
Ganesh Moorthy
Director – Engineering, Tredence

AGCS (Alexa, Cortana, Google, Siri), as I fondly call these services, certainly have taken over my life. I talk to them every day. I take their help in research. I tell them to remind me of important tasks. I even ask them to turn different appliances at home on and off. And they never complain! Now who yells at me, or more importantly, who do I yell at?

The last few years have been a point of inflection in the area of personal assistants, or PIPs (Personal Informational Programs). They have gained a voice of their own – to say the least. Voice-enabled assistants, or voice assists, are an evolution in human-machine interaction. When I say, “I speak with Alexa,” people are no longer surprised. They are just confused – am I referring to the Alexa service or to a real person? Now that’s what I call the first step in machine takeover – the blurring!

Some serious business:

At Tredence, we have been experimenting with Alexa for a couple of months now. What started out as an exploratory process (who and what is Alexa, and how can I have a conversation with her) has led to a more objective-driven program. We like to call this Voice Enabled Insights (VEI).

By integrating Alexa with Tableau, we have managed to provide a short synthesis of how the business is performing. And the best part: the insights are refreshed every morning. Operational managers can now have a free-wheeling conversation with this voice-enabled feature, enhanced with Tableau. What a way to consume your morning coffee insights! The icing on the cake is that our system also crawls the web for competitor information, so you cover the complete landscape. And then, if you want to discuss further, you can ask the humble Alexa to schedule a meeting with the required stakeholders (say, a territory manager) through O365 integration.
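
To make the plumbing a little more concrete, here is a minimal sketch of how an Alexa custom skill backed by an AWS Lambda function might answer such a question. The intent name, the Tableau helper and the KPI fields are hypothetical placeholders for illustration, not our actual VEI implementation.

```python
# Illustrative only: a bare-bones Lambda handler for a custom Alexa skill that
# reads back a daily business summary. The intent name and the Tableau helper
# below are hypothetical placeholders, not the production VEI implementation.
def fetch_kpis_from_tableau():
    # Placeholder: in practice this would query a Tableau data source or extract
    # that is refreshed every morning.
    return {"revenue": 1240000, "vs_plan": 2.3}

def lambda_handler(event, context):
    request = event["request"]

    if (request["type"] == "IntentRequest"
            and request["intent"]["name"] == "GetDailySummaryIntent"):  # hypothetical intent
        kpis = fetch_kpis_from_tableau()
        speech = (f"Yesterday's revenue was {kpis['revenue']} dollars, "
                  f"{kpis['vs_plan']:+.1f} percent versus plan.")
    else:
        speech = "Welcome to Voice Enabled Insights. Ask me how the business is doing."

    # Standard Alexa Skills Kit response envelope
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```

A scheduling request (“Alexa, set up a meeting with the territory manager”) would follow the same pattern, with the handler calling an O365 calendar API instead of the reporting backend.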

So far, we have taken a small step towards a future that is closely integrated, connected and alive all the time – thanks to voice enablement. Looking into the future, imagine a situation where a patient’s family speaks to a panel and asks about the patient’s condition. They receive prompt information on the room, current health parameters and, if the patient is being monitored live, in-operation status. No more long waiting times and anxiety attacks at the help desk.

How about doctors? They can optimize their time by getting critical patients’ conditions and issuing necessary instructions to nurses in near real time. The same goes for any enterprise where there is a lot of personal interaction between service provider and consumer.

Now that we have covered the most important aspect from a personal standpoint – health – let’s move to industrialization and the phenomenon of IoT. There have been rapid advancements in the areas of machine-to-machine communication and the so-called intelligent machines. Add a larynx (a voice-enabled feature) to this combination and I can simply step up to a panel, enquire what the output has been so far and whether there are any issues with the systems, and issue commands to reroute if there is a line fault. All of this without even lifting a finger, literally “speaking”!

In most cases, what we discussed is the benefit of the voice-enabled feature in B2B or B2C scenarios. But this is not all. The corner room assistant can help provide on-demand and interactive directory services, serve as a knowledge bank, and assist with project management. She can facilitate efficiency and timely decisions, and can also gamify training using skills- and stories-based modes for self-learning. Simply put, all we need to be is creative; the tools are already getting quite smart, to say the least.

It is a given today that Alexa and other such services are changing the world and how we interact with it. With the time to act constantly getting shorter, these disruptive innovations will play a greater role in how connected we are. Voice-enabled insights, while not new in concept (remember IVRs?), are beginning to gain popularity owing to the rapid propagation of machine learning and artificial intelligence. They are simply becoming more human in their interactions. It would be wise to get into the race sooner rather than later. But here’s the deal: start out on the journey in incremental ways and then scale. Soon there will come a time when we will simply say, ‘Just ask Alexa!’

Second spin: Driving efficiency and happiness through PMO

Sanat Pai Raikar
Senior Manager, Tredence

In my previous blog, we looked at how the Project Management Organization (PMO) at Tredence enables excellence across projects. In short, traditional PMOs focus on ensuring projects are completed on schedule and processes are followed the right way. At Tredence, the PMO group enables improved project planning, monitoring and control.

In this blog, we will look at how PMO at Tredence drives efficiency on a day-to-day basis, which in turn drives improved work-life balance for employees, as well as improved quality and satisfaction for our clients.

Fostering an efficiency-based mindset is key – constant improvement manifests itself not just in improved quality, but in better job satisfaction as well

Stuck in a rut

Analytics services teams typically follow two modes of operation – medium-term to long-term projects to solve specific business problems, and continued engagements to answer quick-turnaround requests from clients. The latter typically involve same-day deliverables, which lead to a constant time crunch for teams. Teams working on such projects have to, in a way, complete a mini analytics project within a day. This leads to immense pressure in planning one’s day and completing all tasks as per client needs. As time passes, employees in such teams face burnout as they work day in and day out on similar tasks. Besides, a tendency to do the job with eyes shut also creeps in, leaving no room for innovation in the rush to meet urgent deliverables.

Tracking without tracking

As soon as a process or standard method of doing a set of tasks is introduced, it is immediately met with resistance from employees who are used to working without processes. So, if I compelled all employees to, say, track their time on an hourly basis and penalized them for every slip from the plan, I can guarantee that no one would follow it; even if they did, it would be with utmost reluctance and considerable stress.

Alternatively, imagine I set a guideline to the tune of “We will all endeavor to leave by 7 PM every day.” No pressure here! But if an employee is not consciously trying to improve, and then observes most of his colleagues leaving before 7 PM, chances are he will start thinking about following the “best practice” himself. This is a passive way of fostering efficiency and change management.

One can define a hundred processes in the interest of efficiency improvement, but unless individual employees buy into the concept, it will all fail

Passive is not enough

Of course, it will not do to expect things to improve of their own accord. The above strategy can at best lead to incremental improvements, and at worst not help matters at all. The PMO needs to actively foster a culture of continuous improvement. At Tredence, we have worked closely with delivery teams to help them identify the sources of inefficiency. These could be external causes, such as latencies linked to client-side infrastructure, or traffic woes at rush hour. Causes could be internal as well, such as promising more than we could deliver, or going about work in a non-optimal manner. By quantifying the time lost due to each of these causes, we have directly addressed the reasons for inefficiency, fixed them to the extent possible, and created time for employees.

Out of the rut

Once employees realize that the organization is bought into the concept of helping them gain more time out of a day, they buy into the initiatives as well. The value they see coming out of such initiatives justifies the time they spend on providing data and reports for further improvement. As this percolates across levels, employees feel empowered to innovate in themselves and in the work they do on a daily basis, continuously making themselves as well as their colleagues better.

At Tredence, we have enabled multiple teams to identify causes of inefficiency and act on these with improvement goals in mind. The time saved has enabled employees to invest not just in providing more value-added services to our clients, but also to themselves – utilizing the time for learning new skills, improving themselves and getting better at what they do.

How does the PMO team in your organization go beyond just process excellence? Share your thoughts and best practices with us in the comments section.

A new spin to PMO: Driving excellence in a complex business environment

Sanat Pai Raikar
Senior Manager, Tredence

Go to any of the myriad analytics services providers that populate the industry today, walk up to any manager, and ask him if any of the analytics projects he works on is similar to another. Chances are extremely remote that you will receive a response in the affirmative.

Let’s go one step further. Ask the manager how easy it is to hire people with the right skills for different projects, ensure they learn on the job, while being efficient all through. Be prepared for a long rant on the complexities and vagaries of finding good talent and utilizing it to the fullest.

PMO enables us to apply what we sell – analytics – to our own processes for betterment and continuous improvement

Challenges at scale

You would have figured out by now that analytics services companies enable their clients to solve complex business problems. And since each business problem is unique, the approach taken to solve it becomes unique as well. This leaves us with a large set of unique, mutually exclusive analytics projects running at any given point in time; each requiring a separate set of resources, time and infrastructure.

Small analytics organizations can handle this complexity because of multiple factors – a very strong and smart core team, fewer projects to manage, and fewer layers of hierarchy within the organization. But as the analytics services company grows, it becomes increasingly difficult to ensure each project is running efficiently and on the right track. The problem is exacerbated by two facts: the flexibility of a startup is not easily scalable, and employees – especially old-timers – chafe at processes put in place to bring some order into the system. This is where the PMO comes into prominence.

Setting up, and moving beyond the traditional PMO

When a startup evolves into a mature, established analytics services company, the growth usually veils the fact that the company lacks strong processes to scale. In the absence of organization-wide standard processes for running projects, processes start to form in silos – or, in some cases, not at all.

But this leads to inconsistencies in how project delivery is executed. Similar projects are often estimated in different and sometimes erroneous ways; projects are staffed with people who don’t have the right skills, and knowledge often gets lost when team members attrite. Adding to the list of pains, projects don’t get invoiced in time, invoicing schedules are not consistent, and many projects are executed without formal contracts in place. Senior leadership also lacks a common view into the health of project delivery and the pulse of resources working on these projects, at the ground level.

A good PMO organization faces the same problems as a kite flyer – too many processes, and the kite will never take off; too few, and the kite flies off into the wind. But kite flying technique is important as well.

The focus of a traditional Project Management Organization (PMO) is primarily on ensuring projects are completed on schedule and processes are followed the right way. However, for true maturity in delivering analytics services, the PMO needs to move beyond just process focus. It should enable improved project planning, monitoring and control.

It should ensure the right issues are identified at the right time and addressed accordingly. It should ensure people across the organization speak the same language and terms, and provide the leadership team a single view into business performance. At the tactical level, a PMO group should help employees become more efficient and process-oriented. It should foster a culture of accountability, automation and quality control to ensure improved satisfaction for clients as well.

The right level of process

Setting up a PMO group is only half the battle won. The PMO setup needs to regulate the proverbial oxygen flow so employees don’t feel constricted in a mire of process bureaucracy, nor, on the other hand, continue in a false euphoria of individual project flexibility. Internal change management needs to be a smooth process. While adding processes layer by layer, care needs to be taken to ensure that employees do not feel “pained” by the PMO’s “demands” on top of their day-to-day deliverables.

At Tredence, the PMO drives improved quality and timeliness of work outputs, while also serving as a means to achieve work-life balance for our employees. Through a well-planned alignment of employees to the projects that best match their skills, we ensure each team is best equipped to deliver more than the promised results to our clients. In our next blog, we shall discuss in more detail how our PMO group drives improved efficiencies within Tredence and makes our employees more efficient and happy.

So what does the PMO role in your organization look like? Share your thoughts and best practices with us in the comments section.

Data Lakes: Hadoop – The makings of the Beast

Ganesh Moorthy
Director – Engineering, Tredence

1997 was the year of the consumable digital revolution – the year when the cost of computation and storage decreased drastically, resulting in the conversion from paper-based to digital storage. The very next year, the problem of Big Data emerged. As the digitization of documents far surpassed the estimates, Hadoop was the step forward towards low-cost storage. It slowly became synonymous and interchangeable with the term big data. With the explosion of e-commerce, social chatter and connected things, data has grown into new realms. It’s not just about the volume anymore.

In part 1 of this blog, I had set the premise that the market is already moving from PPTware to dashboards and robust machine learning platforms to make the most of the “new oil”.

Today, we are constantly inundated with terms like Data Lake and Data Reservoirs. What do these really mean? Why should we care about these buzz words? How does it improve our daily lives?

I have spoken with a number of people over the years and have come to realize that, for the most part, they are enamoured with the term, not realizing the value or the complexity behind it. Even when they do realize it, the variety of software components and the velocity with which they change are simply incomprehensible.

The big question here would be: how do we quantify Big Data? One aspect to pivot on is that it is no longer the volume of data you collect, but the insight derived through analysis that is important. Data, when used for purposes beyond its original intent, can generate latent value. Making the most of this latent value will require practitioners to envision the 4Vs in tandem – Volume, Variety, Velocity and Veracity.

Translating this into reality will require a system that is:

  • Low cost
  • Capable of handling the volume load
  • Not constrained by the variety (structured, unstructured or semi-structured formats)
  • Capable of handling the velocity (streaming) and
  • Endowed with tools to perform the required data discovery, through light or dark data (veracity)

Hadoop — now a household term — had its beginnings in web search. Rather than making it proprietary, the developers at Yahoo made a life-altering decision to release it as open source, deriving their inspiration from another open-source project called Nutch, which had a component with the same name.

Over the last decade, Hadoop – with the Apache Software Foundation as its surrogate mother and with active collaboration among thousands of open-source contributors – has evolved into the beast that it is.

Hadoop is endowed with the following components –

  • HDFS (Hadoop Distributed File System) — provides a single, unified view of storage spread over a number of different physical systems and ensures enough redundancy of data for high availability.

  • MapReduce — The framework for distributed computation on the stored data using Mappers and Reducers. Mappers work on the input data and emit intermediate key-value tuples (and can include transformations), while Reducers take the tuples from different Mappers and combine them (see the word-count sketch after this list).

  • YARN / Mesos – Resource managers that control the availability of hardware and software resources, along with scheduling and job management. YARN itself has two distinct components – the ResourceManager and the NodeManager.

  • Commons – Common set of libraries and utilities that support other Hadoop components.
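
As a concrete illustration of the Mapper/Reducer split, here is a minimal word-count sketch written for Hadoop Streaming, which lets plain scripts act as the Mapper and Reducer; it is illustrative only, and the streaming jar path in the usage note varies by Hadoop distribution.

```python
#!/usr/bin/env python3
# mapper.py -- Hadoop Streaming mapper: emit a (word, 1) tuple per word seen
import sys

for line in sys.stdin:
    for word in line.strip().split():
        print(f"{word}\t1")
```

```python
#!/usr/bin/env python3
# reducer.py -- Hadoop Streaming reducer: sum the counts for each word
# (the framework sorts mapper output by key before it reaches the reducer)
import sys

current_word, current_count = None, 0
for line in sys.stdin:
    word, count = line.rstrip("\n").split("\t")
    if word == current_word:
        current_count += int(count)
    else:
        if current_word is not None:
            print(f"{current_word}\t{current_count}")
        current_word, current_count = word, int(count)

if current_word is not None:
    print(f"{current_word}\t{current_count}")
```

A typical invocation would look something like `hadoop jar hadoop-streaming.jar -input /data/docs -output /data/wordcount -mapper mapper.py -reducer reducer.py -file mapper.py -file reducer.py`, with HDFS providing the input and output storage and YARN scheduling the Mapper and Reducer containers.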

While the above forms the foundation, what really drives data processing and analysis are frameworks such as Pig, Hive and Spark, along with other widely used utilities for cluster, metadata and security management. Now that you know what the beast is made of (at its core), we will cover the dressings in the next parts of this series. Au revoir!

No more scampering for data, with the rise of metadata

Sanat Pai Raikar
Senior Manager, Tredence

Picture this. You’re looking to purchase an SLR camera. Without any further ado, you visit amazon.com to check out the best deals. You find quite a few and add them to the cart, while continuing to review more details. Two days later, having done all your due diligence, you decide to purchase and check out. In a matter of a few days, you are the proud owner of an SLR camera.

Now, imagine the same level of ease in obtaining data that matters to you – irrespective of the 4Vs!

But this scenario is not easy to come by. In analytics, we generally use the phrase ‘insights are only as good as the data we use’. The reason many analytics projects start with this proviso is not that a lot of data is noise, but that a lot of potentially useful data is not defined correctly, rendering it unusable and leaving the analytics solution incomplete.

Metadata helps plug this gap.

Expanding the scope of metadata

The world of analytics is closely tied to the notion of big data – larger and larger volumes of data which need to be processed to obtain meaningful business information. The big boom we have witnessed in the recent past, though, is the rise in variety[i] of data sources available – everything from voice conversations to product searches on an e-commerce website to people movements tracked by satellite.

But here’s where we face a conundrum – the data we’ve been accustomed to thus far was organized, structured, and usually available in a tabular or database format. As the number of data sources grows, data formats also multiply. The reality: it is no longer humanly possible to hand-create metadata for all the information flowing in. Yet we need to know everything about the data within the various sources if we are to use it effectively. Making the most of it will require a clear definition of these data sources if they are to be used for relevant insight generation and consumption. It will be equally important to capture the basic knowledge that data analysts have at their fingertips: the data itself, quick summary statistics, data size, dimensions, and so on.

Metadata rises to the occasion

In its simplest form, metadata provides that much-needed hygiene; it describes the data structures available to us – column titles, data formats, etc. It describes how the data is organized, in terms of file type, when it was created and last modified, and how we can download data from it. Metadata contextualizes data.
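
To make this concrete, here is a minimal sketch of the kind of descriptive metadata record such an approach might capture for a single tabular source; it assumes a local CSV file and pandas, and the field names are illustrative rather than any particular standard.

```python
# A minimal sketch of a descriptive-metadata record for a tabular file,
# assuming a local CSV source; the field names here are illustrative only.
import json
import os
from datetime import datetime, timezone

import pandas as pd

def build_metadata(path: str) -> dict:
    """Capture basic 'data about data': structure, size, and lineage hints."""
    df = pd.read_csv(path)
    stat = os.stat(path)
    return {
        "source": path,  # traceability back to the origin of the data set
        "file_type": "csv",
        "rows": len(df),
        "columns": {col: str(dtype) for col, dtype in df.dtypes.items()},
        "size_bytes": stat.st_size,
        "last_modified": datetime.fromtimestamp(stat.st_mtime, tz=timezone.utc).isoformat(),
        "summary_stats": json.loads(df.describe(include="all").to_json()),
    }

if __name__ == "__main__":
    # "sales_2017.csv" is a made-up example file
    print(json.dumps(build_metadata("sales_2017.csv"), indent=2, default=str))
```

In practice, records like this would be generated automatically as data lands and stored in a searchable catalog, which is what makes self-serve discovery possible at scale.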

A metadata-based approach will enable organizations to work with all their data assets within the same environment[ii]. It provides a consistent definition, establishes relations and traceability back to the origin of the data set in question.

So, how does the metadata phenomenon play out in an organization?

Data consumption, irrespective of governance

There are organizations that have fixated on their data governance model – centralized or decentralized. Whichever way they sway, metadata ensures business continuity. It translates analytics investment into context and relevance. Smart metadata helps identify linkages across data sources. It allows teams to collaborate across their internal firewalls.

Monetizing data from the start

Across the descriptive, inquisitive, predictive and prescriptive analytics spectrum, metadata provides the security of validated data – thanks to its nomenclature and demography.

Faster data consumption

The discipline embedded in metadata translates into ease of analyzing data with the help of quick self-serve tools. This leads to efficient business analysis and insights gleaned from the data. Add a layer of machine learning, and the task of finding and defining data is pretty much automated.

In this new age of data analytics, we can now safely say that metadata is no longer just “data about data,” rather a means to also uncover new truths about data.

At Tredence, we don’t just answer questions our clients bring to us. We use strong machine learning and data manipulation skills to augment our clients’ data with publicly available information, leading to more robust and actionable business insights.

To know more about Tredence and our offerings, click here.

[i] https://www.tableau.com/resource/top-10-big-data-trends-2017
[ii] https://blogs.informatica.com/2015/12/23/data-management-trends-in-2016-our-predictions/#fbid=gfs8xa7fvYn

From the norm to unconventional analytics: Beyond owning, to seeking data

Shashank Dubey
Co-founder and Head of Analytics, Tredence

The scale of big data, data deluge, 4Vs of data, and all that’s in between… We’ve all heard so many words adjectivized to “Data”. And the many reports and literature have taken the vocabulary and interpretation of data to a whole new level. As a result, the marketplace is split into exaggerators, implementers, and disruptors. Which one are you?

Picture this! A telecom giant decides to invest in opening 200 physical stores in 2017. How do they go about solving this problem? How do they decide the optimal locations? Which neighbourhoods will garner maximum footfall and conversion?

And then there is a leading CPG player trying to figure out where they should deploy their ice cream trikes. Now mind you, we are talking impulse purchase of perishable goods. How do they decide the number of trikes that must be deployed and where, what are the flavours that will work best in each region?

In these two examples, if the enterprises were to make decisions based only on the data available to them (read: owned data), they would make the same mistakes day in and day out – using past data to make present decisions and future investments. The effect stares you in the face: your view of true market potential remains skewed, your understanding of customer sentiment is obsolete, and your ROI will seldom go beyond your baseline estimates. And then you are vulnerable to competition. Calculated risks become too calculated to be game-changing.

Disruption in current times requires enterprises to undergo a paradigm shift: from owning data to seeking it. This transition requires a conscious set-up:

Power of unconstrained thinking

As adults, we are usually too constrained by what we know. We have our jitters when it comes to stepping out of our comfort zones – preventing us from venturing into the wild. The real learning though – in life, analytics or any other field for that matter – happens in the wild. To capitalize on this avenue, individuals and enterprises need to cultivate an almost child-like, inhibition-free culture of ‘unconstrained thinking’.

Each time you are confronted with an unconventional business problem, pause and ask yourself: if I had unconstrained access to all the data in the world, how would my solution design change? What data (imagined or real) would I require to execute the new design?

Power of approximate reality

There is a lot we don’t know and will never know with 100% accuracy. However, this has never stopped the doers from disrupting the world. Unconstrained thinking needs to meet approximate reality to bear tangible outcomes.

The question to ask here would be: what are the nearest available approximations of all the data streams I dreamt of in my unconstrained ideation?

You will be amazed at the outcome. For example, using Yelp to identify the hyperlocal affluence of a catchment population (resident as well as moving), or estimating the footfall in your competitors’ stores by analysing data captured from several thousand feet in the air.

This is the power of combining unconstrained thinking and approximate reality. The possibilities are limitless.

Filter to differentiate signal from noise – Data Triangulation

Remember, you are no longer as smart as the data you own, but as smart as the data you earn and seek. At a time when data is abundant and streaming, the bigger decision to make while seeking data is identifying the “data of relevance”. An ability to filter signal from noise will be critical here. In the absence of on-ground validation, triangulation is the way to go.

The data ‘purists’ among us would debate this approach of triangulation. But welcome to the world of data you don’t own. Here, some conventions will need to be broken and mindsets will need to shift. We at Tredence have found data triangulation to be one of the most reliable ways to validate the veracity of unfamiliar and unvouched-for data sources.
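
As a minimal sketch of the idea – and only a sketch, with made-up source names and an arbitrary tolerance – triangulation can be as simple as cross-checking independent estimates of the same quantity and flagging the sources that stray too far from the consensus.

```python
# Illustrative sketch of data triangulation: cross-check independent estimates
# of the same quantity and flag the measurements that diverge from the consensus.
# Source names and the tolerance are assumptions, not a prescribed method.
from statistics import median

def triangulate(estimates: dict, tolerance: float = 0.2):
    """Return a consensus value and any sources that stray too far from it."""
    consensus = median(estimates.values())
    outliers = {
        source: value
        for source, value in estimates.items()
        if consensus and abs(value - consensus) / consensus > tolerance
    }
    return consensus, outliers

# Example: three unfamiliar sources estimating weekly footfall near a candidate store site
footfall = {"maps_popularity": 4200, "satellite_parking": 3900, "social_checkins": 7800}
consensus, outliers = triangulate(footfall)
print(consensus, outliers)   # 4200 {'social_checkins': 7800}
```

A flagged source is not necessarily wrong, but it is the one to investigate before letting it drive a location or deployment decision.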

Ability to tame the wild data

Unfortunately, old wine in a new bottle will not taste too good. When you explore data in the wild – beyond the enterprise firewalls – conventional wisdom and experience will not suffice. Your data scientist teams need to be endowed with unique capabilities and the technological know-how to harness the power of data from unconventional sources. In the two examples mentioned above – the telecom giant and the CPG player – our data scientist team capitalized on freely available hyperlocal data to conjure up a great solution for location optimization, from the data residing in Google Maps, Yelp, and satellites.

Having worked with multiple clients across industries, we have come to realize the power of this approach – of combining owned and sought data, with no compromise on data integrity, security and governance. After all, game changers and disruptors are seldom followers; they pave their own path and choose to find the needle in the haystack as well!

Does your organization disrupt through the approach we just mentioned? Share your experience with us.

Making the Most of Change (Management)

Sulabh Dhall
Associate Director

“The illiterate of the 21st century will not be those who cannot read and write, but those who cannot learn, unlearn, and relearn.”

– Alvin Toffler

“Times have changed.” We’ve heard this statement ever so often. Generations have used it to exclaim “things are so complicated (or simple) these days,” or to express disdain – “oh, so they think they are the cool generation.” Whichever way you exclaim it, change has truly been the “constant”.

This change is bolstered by a tech-enabled world where the speed at which machines learn keeps accelerating – towards the proverbial speed of light.

Let me set this in context with an example from the book of Sales. Unlike in the past, today sales reps are not gauged by the amount of sweat trickling down their foreheads. While they continue to be evaluated in terms of business development and lead conversions, it is not all manual and laborious. Technology advancements have made the process of identifying, prioritizing, scheduling, conversing and converting agile and real-time.

But just knowing change, gathering data and appreciating technology will not suffice. The three will need to be blended seamlessly to yield transformation. Applied to a deeper organizational context, “change” needs to be interpreted – its pace needs to be matched or, even better, its effect needs to be contextualized for differentiation.

Change management in this sense is the systematization of the entire process: from the acceptance of change, to its adoption, to taking advantage of it to thrive in volatile times.

But what would it take for complex enterprises that swear by legacy systems to turbocharge into change management mode?

To answer this, I will humanize enterprise change management with the Prosci-developed ADKAR Model.

Awareness (getting into the race) – Where can I set up the next retail store? What is the optimal planogram? How do I determine the right marketing mix? What is my competition doing differently? How do I improve customer experience? How do I ensure sales force effectiveness? The questions are ample. By the time you realize this and start strategizing, a competitor has dislodged your market position and eaten a large portion of your pie. And while these business problems seem conventional, volatility in the marketplace says otherwise. Compound this with heavy dependence on dashboards, applications and the like for insights, and you have seen the side effects – established enterprises biting the dust.

To survive, organizations will need to be knowledgeable about the data that matters vis-à-vis the noise. They will need to interpret the data deluge in relevance and context; after all, not all data is diamond.

Desire (creating a business case for adoption) – Desire is a basic human instinct. Our insatiable urge to want something more, something better, accentuates this instinct. When it comes to enterprises, this desire is no different: to stay ahead of the curve, to make more profits, to be leaders. But there is no lock-and-key fix to achieve this mark. Realizing corporate “desire” will require a cultural and mindset shift across the organization – top-down. And so, one of the most opportune times could be when there are changes in leadership, followed by reorganization in the rungs below.

Gamification could be a great starting point to drive adoption in such cases. Allow the scope for experimentation to creep in; invest consciously in simmer projects; give analysts a free hand to look for the missing piece of the puzzle outside their firewall; incentivize them accordingly. Challenge business leaders to up their appreciation for the insights generated, encourage them to get their hands dirty when it comes to knowing their sources, ask the right questions and challenge the status quo – not just rely on familiarity and past experience.

Knowledge and Ability (from adoption to implementation) – In a business context, “desire” typically translates into business goals – revenue, process adoption, automation, newer market expansion, launch of a new product or solution, and so on. Mere awareness of the changes taking place does not translate into achievement. The change needs to be studied and change management needs to be initiated.

But how can you execute your day job and learn to change?

The trick here will be to make analytics seamless, almost second nature. Just as your bank alerts you about any suspicious transaction on your account, any deviation from the set course of business action needs to trigger an alert.
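
As a purely illustrative sketch of that idea (not a description of any Tredence product), a deviation alert can be as simple as comparing today's value of a metric against its trailing baseline; the window and threshold below are assumptions.

```python
# Illustrative only: alert when a business metric deviates too far from its
# recent baseline, the way a bank flags an unusual transaction.
# The trailing window and the z-score threshold are assumptions.
from statistics import mean, stdev

def check_deviation(history, today, z_threshold=3.0):
    """Return an alert message if today's value is an outlier versus the trailing window."""
    baseline, spread = mean(history), stdev(history)
    if spread == 0:
        return None
    z = (today - baseline) / spread
    if abs(z) > z_threshold:
        return f"ALERT: value {today} deviates {z:+.1f} sigma from baseline {baseline:.0f}"
    return None

daily_sales = [102, 98, 105, 101, 99, 103, 100]   # trailing week, in thousands
print(check_deviation(daily_sales, today=64))     # a large drop -> alert
```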

Such technology-assisted decisions are the need of today and of the future. The Tredence CHA solution is an example in this direction. It is intuitive, convenient and evolving, mirroring aspects of Robotic Process Automation (RPA).

Reinforcement (stickiness will be key) – Your business problems are yours to know and yours to solve. As my colleague mentioned in his blog, a one-size-fits-all solution does not exist. Solving the business challenges of today requires going to their root cause, understanding the data sources available to you, and being knowledgeable about other data combinations (across the firewall or within) that matter. Match this stream of data with relevant tools and techniques that can give you the “desired” results.

A point to keep in mind during this drill is to marry the old and the new. Replacing a legacy system with something totally new could leave a bad taste in your mouth – with less adoption and greater resistance. Embedded analytics will be key – analytics that allows you to seamlessly time-travel between the past, present and future.

To conclude, whether it is about the time to implement change, improving customer service, reducing inefficiencies, or mitigating the negative effects of volatile markets, change management will be pivotal. It is a structured, ongoing process to ensure you are not merely surviving change, but thriving in it.

Key to bridging the analytics-software chasm: iterative approach + customized solutions, leading to self-service BI


Ganesh Moorthy
Director – Engineering, Tredence

The worlds of software development and IT services have operated through well-defined requirements, scope and outcomes. Twenty-five years of experience in software development have enabled IT services companies to learn significantly and achieve higher maturity. There are enough patterns and standards that one can leverage in order to avoid scope creep and make on-time delivery and quality a reality. This world has a fair degree of order.

This is quite contrary to the analytics world we operate in. Analytics as an industry is itself a relatively new kid on the block. Analytical outcomes are usually insights generated from historical data, viz. descriptive and inquisitive analysis. With the advent of machine learning, the focus is gradually shifting towards predictive and prescriptive analysis. What takes months or weeks in software development usually takes just days in the analytics world. At best, this chaotic world posits the need for continuous experimentation.

The questions enterprises need to ask are: “How do we leverage the best of both worlds to achieve the desired outcomes?” and “How do we bridge this analytics-software chasm?”

The answers require a fundamental shift in perception and in the approach towards problem solving and solution building. The time to move from what is generally PPTware (in the world of analytics) to dashboards, and further to a robust machine learning platform for predictive and prescriptive analyses, needs to be as short as possible. The market is already moving towards this purpose in the following ways:

  1. Data Lakes – On-premise platforms built mostly from an amalgamation of open-source technologies and existing COTS (commercial off-the-shelf) software – a homegrown approach that provides a single unified platform for rapid experimentation on data, along with the capability to move quickly towards scaled solutions
  2. Data Cafes / Hubs – A cloud-based, SaaS approach that covers everything from data consolidation and analysis to visualization
  3. Custom niche solutions that serve a specific purpose

Over a series of blogs, we will explore the above approaches in detail. These blogs will give you an understanding of how integrated and interoperable systems allow you to rapidly take your experiments towards scaled solutions, in a matter of days and in a collaborative manner.

The beauty and the beast are finally coming together!

SOLUTIONS, WHAT’S NEW?

Sagar Balan
Associate Director – Analytics, Tredence

Dell, HP and IBM have all tried to transform themselves from box sellers into solution providers. Then, in the world of Uber, many traditional products are fast mutating into services. At Walmart, it is no longer about grocery shopping. Their pick-and-go service tries to understand more about your journey as a customer, and grocery shopping is just one piece of the puzzle.

There’s a certain common thread that runs across all three examples. It’s about how to break through the complexity of your end customer’s life. Statistics, machine learning and artificial intelligence can’t, by themselves, make the lives of store managers at over 2,000 Kroger stores across the country any simpler. It all sounds way too complex.

Before I get to the main point, let me belabor a bit and humor you on other paradigms floating around. Meta software, Software as a Service, cloud computing, Service as a Software… Err! Did I just go to randomgenerator dot com and get those names out? I swear I did not.

The cliché in the recent past has been about how industries are racing to unlock the value of big data and create big insights. And with this herd mentality comes all the jargon in an effort to differentiate. Ultimately, it is about solving problems.

In the marketplace abstraction of problem solving, there’s a supply side and a demand side.

The demand side is an overflowing pot of problems. Driven by accelerating change, problems evolve really fast and newer ones keep popping up. Across Fortune 500 firms, there are very busy individuals and teams running businesses the world over, grappling with these problems – ranging from store managers in a retail chain, to trade promotion managers in a CPG firm, district sales managers in a pharma firm, decision engineers in a CPG firm, and so on. For these individuals, time is a very precious commodity. Analytics is valuable to them only when it is actionable.

On the supply side, there is complex math (read: algorithms), advanced technology and smart people to interpret the complexities. And, for the geek in you, this is a candy-store situation. But how do we make this complex math – machine learning, AI and everything else – actionable?

To help teams/individuals embrace the complexity and thrive in it, nature has evolved the concept of solutions. Solutions aim to translate the supply side intelligence into simple visual concepts. This approach takes intelligence to the edge, thereby scaling decision making.

So, how do solutions differ from products, from meta-software, service as a software and the gibberish?

Fundamentally, a solution is meant to exist as a standalone atomic unit – with the singular purpose of making the lives of decision makers easy and simple. It is not created to scale the creation of analytics. For example, a solution created to detect anomalies in pharmacy billing will be designed to do just that; its design will not be distorted by the efficiency-driven motivation to apply it to a fraud detection problem as well. Because the design of a solution is driven by the needs of the individual dealing with the problem, it should not be driven by the motivation to scale the creation of analytics. Rather, it should be driven by the motivation to scale the consumption of analytics – to push all the power of machine learning and AI to the edge.

In Tredence, you have a partner who can execute the entire analytical value chain and deliver a solution at the end. No more running to the IT department with a deck or SAS/R/Python code, asking them to create a technology solution. Read more about our offerings here.

This blog is the first of a two-part series. The second part will be about spelling out S.O.L.U.T.I.O.N.