Tuesday, May 1, 2012

The Art of the Possible with Business Analytics


It has been established beyond doubt that data and its analysis can have a huge impact on an organization’s top line and bottom line. Business Analytics helps organizations deliver better business performance in two ways: by optimizing business processes and by helping them innovate. Optimization makes organizations efficient and effective by taking inefficiencies out of business processes and focusing on high-impact opportunities. Innovation, on the other hand, helps organizations uncover new customer segments, product categories, markets and business models.

The styles of analyzing data are manifold, from answering questions like “what is going on?” to “why are things the way they are?” to “what will happen if I do X or Y?” to “what does the future look like?” Broadly speaking, the styles of analytics can be classified into three categories:

·         Exploratory Analysis: The objective of exploratory or investigative analysis is the exploration and analysis of complex and varied data, whether structured or unstructured, for information discovery. This style of analysis is particularly useful when the questions aren’t well formed or the value and shape of the data isn’t well understood.

·         Descriptive Analytics: The objective of this style of analysis is to answer historical or current questions like “what is going on?” and “why are things the way they are?” This is the most common style of analysis, and here the questions as well as the value and shape of the data are well understood.

·         Predictive Analysis: Predictive analysis aims at painting a picture of the future with some reasonable certainty.

So, what’s the art of the possible with business analytics? It’s the application of the above three styles of analytics to a business scenario for better insights, decisions and results. Let’s try to explain this with an example. Consider this scenario:

You are a financial services firm, e.g. a large bank, trying to improve profitability. You read Larry Selden’s book “Angel Customers and Demon Customers”, agree with its finding that the top 20% of your customers bring in 80% of the profits, and would like to manage your business as a portfolio of customers as opposed to a portfolio of products. So, how do you do that? The answer is business analytics.

You can start by using descriptive analytics techniques like operational reports, ad-hoc query and dashboards on data collected from different sources like sales and customer service to determine the profitability of each customer. You can then use predictive analysis techniques like data mining and statistical analysis to further enrich your customer data into profitability segments: high, medium, low and loss-making customers. Next, you can choose different customer service channels like a personal banker, phone or ATM to cost-effectively serve your customers; e.g. a high-profitability customer can be served by a personal banker free of charge, but if a loss-making customer wants a personal banker there will be a charge. Once you have implemented such programs, you can use exploratory analysis to gauge sentiment across social media channels like Facebook and Twitter to see if the programs are working as desired. Better yet, you may come up with innovative new business models like mobile banking or online-only banking to improve profitability.
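The segmentation step in this scenario can be sketched in a few lines of code. This is purely an illustrative sketch: the customer IDs, profit figures and segment thresholds below are hypothetical, invented for the example.

```python
# Hypothetical sketch: bucketing customers into profitability segments.
# Thresholds and data are made up for illustration.

def profitability_segment(profit):
    """Map a customer's annual profit (in dollars) to a segment."""
    if profit < 0:
        return "loss-making"
    if profit < 1_000:
        return "low"
    if profit < 10_000:
        return "medium"
    return "high"

# Profit per customer, as if aggregated from sales and customer-service
# records in the descriptive-analytics step.
customer_profit = {
    "C001": 25_000,   # candidate for free personal-banker service
    "C002": 4_500,
    "C003": 250,
    "C004": -1_200,   # personal-banker service would carry a charge
}

segments = {cid: profitability_segment(p) for cid, p in customer_profit.items()}
print(segments)
```

A real predictive model would of course learn such segments from data rather than use fixed cut-offs, but the output, a segment label per customer, is what drives the service-channel decisions described above.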

That’s the art of the possible, powered by business analytics. Stay tuned; I intend to publish more examples from different industries to show the art of the possible with business analytics.



Thursday, March 29, 2012

Does your analytic solution tell you what questions to ask?

Analytic solutions exist to answer business questions. Conventional wisdom holds that if you can answer business questions quickly and accurately, you can make better business decisions and therefore achieve better business results and outperform the competition. Most business questions are well understood (read: structured), so they are relatively easy to ask and answer. Questions like what were the revenues, cost of goods sold and margins, and which regions and products outperformed or underperformed, are relatively well understood, and as a result most analytics solutions are well equipped to answer them.
Things get really interesting when you are looking for answers but don’t know what questions to ask in the first place. That’s like an explorer setting out to make new discoveries. An example of this scenario is the Centers for Disease Control and Prevention (CDC) in the United States trying to find a vaccine for the latest strain of the swine flu virus. The researchers at the CDC may try hundreds of options before finally discovering the vaccine. The exploration process is inherently messy and complex. It is fraught with false starts, one question or hunch leads to another, and the final result may look entirely different from what was envisioned in the beginning. Speed and flexibility are key: speed so the hundreds of possible options can be explored quickly, and flexibility because almost everything about the problem, the solutions and the process is unknown.
Come to think of it, most organizations operate in an increasingly unknown or uncertain environment. Business leaders have to make decisions based on a largely unknown view of the future. And since the value proposition of analytic solutions is to help business leaders make better decisions, for best results consider adding information exploration and discovery capabilities to your analytic solution. Such exploratory analysis capabilities will help business leaders perform even better by empowering them to refine their hunches, ask better questions and make better decisions. That’s your analytic system not only answering questions but also suggesting what questions to ask in the first place.
Today, most leading analytic software vendors offer exploratory analysis products as part of their analytic solution offerings. So, what characteristics should be top of mind while evaluating the various solutions? Quite simply, the same characteristics that are essential for exploration and analysis: speed and flexibility. Speed is required because the system has to be agile enough to handle hundreds of different scenarios with large volumes of data across large user populations. Exploration happens at the speed of thought, so make sure your system is capable of operating at that speed. Flexibility is required because the exploration process is full of unknowns from start to finish: unknown questions, answers and hunches. So, make sure the system is capable of managing and exploring all relevant data, structured or unstructured, like databases, enterprise applications, tweets, social media updates, documents, texts and emails, and provides a flexible, Google-like user interface to quickly explore all of it.
Getting Started
You can help business leaders become “Decision Masters” by augmenting your analytic solution with information discovery capabilities. For best results, make sure the solution you choose is enterprise-class and allows advanced, yet intuitive, exploration and analysis of complex and varied data, including structured, semi-structured and unstructured data. You can learn more about Oracle’s exploratory analysis solutions by clicking here.

Wednesday, February 22, 2012

5 Facts that SAP won't tell you about HANA


SAP has been touting HANA as an innovative, breakthrough technology and the next “big thing”. They are aspiring to be the #2 database vendor riding on the HANA hype. Well, time to dig deeper and bring forward 5 facts that SAP won’t tell you about HANA.

#1: HANA is an In-Memory database. So, where’s the innovation?
SAP has positioned HANA as its newest, most innovative, category-defining offering. However, HANA is an in-memory database, which may be a new category for SAP but has existed in the market for years. A quick Wikipedia search reveals that in-memory databases have been around since the 1990s and that today there are 40+ such independent offerings, of which HANA is one. Oracle alone has 3 in-memory database offerings, with successful products like TimesTen, Berkeley DB and MySQL. Introduced in the 1990s, Oracle’s TimesTen remains an early innovator and a leader in this space. HANA, introduced in 2011, is the youngest member of the group.

#2: HANA adoption is growing rapidly. So, where’s the growth?
SAP will show numbers like FY 2011 revenues of $200M and 100+ customers to underscore HANA’s rapid customer adoption. Putting these numbers in perspective: Vertica, the largest independent in-memory database vendor before being acquired by HP in 2011, was on track to deliver revenues of around $100M with 200+ customers and over 100% YoY growth. Oracle remains the leader in the data warehouse platform market with FY 2010 revenues of close to $3B and thousands of customers. Given Oracle’s and Vertica’s impressive performance, HANA’s numbers, while good, are hardly “rapid”.

#3: HANA is enterprise ready. So, where’s the manageability and reliability?
It takes years to develop and perfect a complex product like a database management system. The Oracle database has been perfected over 30+ years and billions of dollars in R&D investment. TimesTen has been around for 15 years and is still being aggressively developed and perfected. SAP would like you to believe that HANA is enterprise-ready from day one, but dig deeper and you’ll find that HANA lacks basic features like clustering, high availability, file system persistence and ACID-style transaction integrity support. HANA lacks referential integrity support, so there is NO means to ensure the integrity of data stored in a HANA database. HANA’s support for locks and transaction isolation is primitive, so multi-user concurrency is an issue. Hopefully you get the picture: HANA is an immature version 1 DBMS that is far from ready to support mission-critical enterprise applications.
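To make the referential-integrity point concrete, here is a small, vendor-neutral illustration (using SQLite, not HANA; the table names are made up). With foreign-key enforcement on, the database engine itself rejects orphan rows instead of silently storing them:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite leaves this off by default
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY)")
conn.execute("""CREATE TABLE orders (
                    id INTEGER PRIMARY KEY,
                    customer_id INTEGER REFERENCES customers(id))""")
conn.execute("INSERT INTO customers VALUES (1)")
conn.execute("INSERT INTO orders VALUES (100, 1)")  # valid: parent row exists

rejected = False
try:
    conn.execute("INSERT INTO orders VALUES (101, 999)")  # no customer 999
except sqlite3.IntegrityError:
    rejected = True  # the engine enforced referential integrity
print("orphan row rejected:", rejected)
```

Without such enforcement, the burden of keeping rows consistent shifts entirely to application code, which is exactly the risk being described above.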

#4: HANA is non-disruptive. So, where’s the plug and play?
HANA has limited support for standard ANSI SQL. In fact, HANA requires applications to be custom-written for it using non-standard SQL. In my view this is a major showstopper. In this day and age, when every vendor is working diligently to support openness and application integration via service-oriented architecture and web services, in comes HANA with SAP’s age-old vision of a closed system with no access to the underlying data structures. HANA takes vendor lock-in to new levels by limiting your choice of applications and reporting and analysis tools to the few offered by SAP.

#5: HANA is an appliance. So, where’s ease and speed of deployment?
Wikipedia defines a computer appliance as hardware and software pre-integrated and pre-configured before delivery to the customer, providing a “turn-key” solution to a particular problem. The benefits of appliances include ease and speed of deployment with lower risk and faster time to value. With HANA, you buy hardware, software, networking switches and storage from different vendors. There isn’t a single point of support, and with the different vendors having markedly different development and upgrade cycles, it’s excruciatingly hard to test, configure, certify and update the joint solution.

In conclusion, SAP’s larger-than-life HANA definitely underscores the strategic importance of data management and analysis to organizations, but due to the limitations highlighted above it is far from ready to support mission-critical enterprise applications. Customers should consider mature technologies like the TimesTen-based Oracle Exalytics and Oracle Exadata for their in-memory analytics needs.

Tuesday, February 7, 2012

Big Data Analytics – The Journey from Transactions to Interactions


Big Data Defined

Enterprise systems have long been designed around capturing, managing and analyzing business transactions, e.g. marketing, sales and support activities. Lately, however, with the evolution of automation and Web 2.0 technologies like blogs, status updates and tweets, there has been explosive growth in machine- and consumer-generated data. Defined as “Big Data”, this data is characterized by attributes like volume, variety, velocity and complexity, and essentially represents machine and consumer interactions.

Case for Big Data Analysis

Machine and consumer interaction data is forward-looking in nature. This data, available from sensors, web logs, chats, status updates, tweets etc., is a leading indicator of system and consumer behavior. It is therefore the best indicator of the consumer’s decision process, intent and sentiments, and of system performance. Transactions, on the other hand, are lagging indicators of system or consumer behavior. By definition, leading indicators are more speculative and less reliable than lagging indicators; however, to predict the future with any confidence, a combination of both leading and lagging indicators is required. That’s where the value of big data analysis comes in: by combining system and consumer interactions with transactions, organizations can better predict the consumer decision process, intent, sentiments and future system performance, leading to revenue growth, lower costs, better profitability and better-designed systems.

So, which business areas will benefit from big data analysis? Think of areas where decision-making under uncertainty is required. Areas like new product introduction, risk assessment, fraud detection, advertising and promotional campaigns, demand forecasting, inventory management and capital investments will particularly benefit from having a better read on the future.

Figure 1: Combination of big data and transactional data delivers better insights and business results


Big Data Analytics Lifecycle
The big data analytics lifecycle includes three steps: acquire, organize and analyze. Big data, or consumer interaction data, is characterized by attributes like volume, velocity and variety, and common sources include web logs, status updates and tweets. The analytics process starts with data acquisition. The structure and content of big data can’t be known upfront and are subject to change in flight, so data acquisition systems have to be designed for flexibility and variability; there are no predefined data structures, and dynamic structures are the norm. The organization step entails moving the data into well-defined structures so relationships can be established and data across sources can be combined to get a complete picture. Finally, the analysis step completes the lifecycle by providing rich business insights for revenue growth, lower costs and better profitability. Flexibility being the norm, the analysis systems should be discovery-oriented and explorative as opposed to prescriptive.
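The acquire-organize-analyze flow can be illustrated with a toy example. The log format, field names and counts below are invented purely for illustration; real pipelines operate at vastly larger scale and with messier data.

```python
from collections import Counter

# Acquire: raw, semi-structured web-log lines; no schema imposed upfront.
raw_log = [
    "2012-02-07 10:01 user=alice action=view product=loans",
    "2012-02-07 10:02 user=bob action=view product=cards",
    "2012-02-07 10:03 user=alice action=apply product=loans",
]

# Organize: parse each line into a well-defined record so it can later
# be related to data from other sources.
def parse(line):
    date, time, *pairs = line.split()
    record = {"date": date, "time": time}
    record.update(pair.split("=", 1) for pair in pairs)
    return record

records = [parse(line) for line in raw_log]

# Analyze: a simple aggregate over the organized records.
actions_per_product = Counter(r["product"] for r in records)
print(actions_per_product)
```

Note that the schema emerges from the data at parse time rather than being fixed in advance, which is the flexibility the acquisition step calls for.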

Getting Started
Oracle offers the broadest and most integrated portfolio of products to help you acquire and organize these diverse data sources and analyze them alongside your existing data to find new insights and capitalize on hidden relationships. Learn how Oracle helps you acquire, organize and analyze your big data by clicking here.


Figure 2: Oracle’s engineered system solution for big data analytics

Tuesday, January 24, 2012

Oracle Exalytics Pricing explained - a wonderful product at a wonderful price


Warren Buffett famously said, and I quote (with some edits), “It's far better to buy a wonderful company (product) at a fair price than a fair company (product) at a wonderful price”. Conventional wisdom has it that quality doesn’t come cheap. Well, in this day and age, when conventions are broken every day, it’s time to think differently. What if the best analytics solution in the world were available at a bargain-basement price?

Oracle recently announced the pricing for Exalytics, the industry’s first in-memory analytics machine, and, as conventional wisdom would have it, a number of articles were published putting Exalytics pricing in the millions-of-dollars range. But once again, continuing with the glowing tradition of “WHY NOT”, Oracle is out to prove the conventional wisdom wrong. Drum roll please… NOW YOU CAN GET EXALYTICS FOR MUCH LESS THAN THE MILLION DOLLAR MARK. No gimmicks, no discounts, all based on the list price.
Exalytics includes 3 components: hardware, software and support.
Hardware Cost:
1)      Exalytics hardware, list price: $135,000
Software Cost:
Exalytics includes two software components:
2)      TimesTen In-Memory Database for Exalytics: $300 per named user (100-user minimum) OR $34,500 per processor
3)      Oracle BI Foundation Suite: $3,675 per named user (100-user minimum) OR $450,000 per processor
Support Cost:
4)      Annual support for the Exalytics hardware ($29,700), TimesTen In-Memory Database for Exalytics ($66 per user OR $7,590 per processor) and Oracle BI Foundation Suite ($808.50 per user OR $99,000 per processor).

So, what’s the total cost of deploying an analytic system with 100 users?
Exalytics cost for a 100-user system = (1) + 100×(2) + 100×(3) + (4) = $135,000 + 100×$300 + 100×$3,675 + ($29,700 + 100×$66 + 100×$808.50) = $135,000 + $30,000 + $367,500 + $117,150 = $649,650
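The arithmetic above is easy to verify; here it is as a short script, using the list prices quoted in this post:

```python
users = 100

hardware = 135_000                 # (1) Exalytics hardware, list price
timesten_per_user = 300            # (2) TimesTen for Exalytics, per named user
bi_foundation_per_user = 3_675     # (3) Oracle BI Foundation Suite, per named user

support_hardware = 29_700          # (4) annual hardware support
support_timesten_per_user = 66     #     annual TimesTen support, per user
support_bi_per_user = 808.50       #     annual BI Foundation support, per user

total = (hardware
         + users * timesten_per_user
         + users * bi_foundation_per_user
         + support_hardware
         + users * support_timesten_per_user
         + users * support_bi_per_user)

print(f"Total for {users} users: ${total:,.0f}")  # Total for 100 users: $649,650
```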

Now, discounts of up to 50% are quite common in the software world. I don’t know how much Oracle discounts, but assuming a 50% discount rate we are looking at a 100-user system powered by 1 TB of RAM, 40 CPU cores and market-leading BI and in-memory database technology for around $325K, or roughly $3K per user. Compare this to the recurring $3K per user per year that Salesforce charges for its Sales Cloud, or the $5K per month pricing offered by a cloud-based BI provider.

In this day and age, when we are moving away from “Whys” to “Why Nots”, I think Oracle Exalytics has definitely proved the conventional wisdom wrong by delivering the best value at the best price. Well, this would even make Warren Buffett revise his quote: “a wonderful company (product) at a wonderful price”.


Thursday, November 17, 2011

Introducing the Industry's First Analytics Machine, Oracle Exalytics


Analytics is all about gaining insights from data for better decision making. The business press is abuzz with examples of leading organizations across the world using data-driven insights for strategic, financial and operational excellence. A recent study on “data-driven decision making” conducted by researchers at MIT and Wharton provides empirical evidence that “firms that adopt data-driven decision making have output and productivity that is 5-6% higher than the competition”. The potential payoff for firms can range from higher shareholder value to a market leadership position.

However, the vision of delivering fast, interactive, insightful analytics has remained elusive for most organizations. Most enterprise IT organizations continue to struggle to deliver actionable analytics due to time-sensitive, sprawling requirements and ever-tightening budgets. The issue is further exacerbated by the fact that most enterprise analytics solutions require dealing with a number of hardware, software, storage and networking vendors, and precious resources are wasted integrating the hardware and software components to deliver a complete analytical solution.

Oracle Exalytics In-Memory Machine is the world’s first engineered system specifically designed to deliver high-performance analysis, modeling and planning. Built using industry-standard hardware, market-leading business intelligence software and in-memory database technology, Oracle Exalytics is an optimized system that delivers answers to all your business questions with unmatched speed, intelligence, simplicity and manageability.

Oracle Exalytics’ unmatched speed, visualizations and scalability deliver extreme performance for existing analytical and enterprise performance management applications and enable a new class of intelligent applications like Yield Management, Revenue Management, Demand Forecasting, Inventory Management, Pricing Optimization, Profitability Management, Rolling Forecast and Virtual Close.

Requiring no application redesign, Oracle Exalytics can be deployed in existing IT environments by itself or in conjunction with Oracle Exadata and/or Oracle Exalogic to enable extreme performance and a best-in-class user experience. Based on proven hardware, software and in-memory technology, Oracle Exalytics lowers the total cost of ownership, reduces operational risk and provides unprecedented analytical capability for workgroup, departmental and enterprise-wide deployments.

Click here to learn more about Oracle Exalytics.

Wednesday, October 12, 2011

Five ways Oracle Exalytics is “different” from SAP HANA


The social media platforms are abuzz with comparisons between Oracle Exalytics & SAP HANA. Some of our esteemed colleagues from the other side have tried hard, but failed miserably, to differentiate between Exalytics & HANA. Frankly, you don’t need to be a PhD or a wine connoisseur to understand the differences; all you need is to invest some time. Here are the top 5 ways (David Letterman style) in which HANA tries to imitate Exalytics but fails miserably:

#5: HANA is an “appliance”, Exalytics is an engineered solution: OK, I am being generous here by classifying HANA as an appliance. My definition of an appliance is hardware & software put together for a specific purpose. So maybe HP & Microsoft joined forces to offer a BI appliance, but a piece of software running on a bunch of supported hardware platforms without any specific purpose, which is what HANA is, doesn’t qualify as an appliance. Exalytics, on the other hand, is an engineered solution: one that is purpose-built to solve a specific problem; think dental braces vs. metal wires. Exalytics is hardware and software purpose-built to handle analytical workloads. Unlike an appliance, Exalytics’ software has been designed from the ground up to best exploit the underlying hardware. Features like in-memory processing, parallelization, automated intelligent cache management and compression deliver the best analytics performance and exploit the abundance of memory and processing capability available on Exalytics.

#4: HANA does everything but analytics, Exalytics is purpose-built for analytics: HANA takes on a new purpose depending on the day of the week. HANA is an analytics database today; in the future, it will be a transactional database. Well, there’s nothing wrong with being opportunistic and finding a new purpose every day; experimentation is good. I would certainly like my 4-year-old to experiment and figure out whether he wants to be an astronaut, a scientist or an artist, but how would you feel about experimenting with your mission-critical business systems? My advice: please don’t innovate for the sake of innovation.
Exalytics, on the other hand, does only one thing: it delivers unmatched performance and user experience for speed-of-thought analytics. It supercharges your existing BI deployments and enables a new category of smart analytical applications like yield management, revenue management, real-time forecasting, virtual financial close and dynamic pricing. It integrates transparently into your existing IT environment and requires no manual data movement or costly changes to your application code or behavior. Now this is true innovation without disruption.

#3: HANA solves the problem that Exadata solved 2 years ago: HANA is supposedly a database query accelerator; it makes database queries run faster. Duh, a groundbreaking innovation from SAP. Well, SAP, welcome to the party; you are just 2+ years late. Oracle solved the database query acceleration problem 2+ years ago with the introduction of Exadata. We can debate the technical nuances like in-memory etc., but with 2 TB of RAM and innovations around storage and data access, Exadata remains the fastest database machine on the planet.

#2: HANA is closed, Exalytics is an open solution: HANA is designed to work with SAP data and tools only. Now, this is Database 101: you don’t design a database that is closed. The basic concept of a DBMS is to act as a data consolidation platform where data can be collected, stored, managed and accessed openly. OK, you can’t really fault the SAP developers here; after all, they are designing version 0.1 of a product that Oracle has spent 30 years and billions of dollars perfecting. Exalytics is a completely open middle-tier analytics platform. It connects to any or all commercially available databases, like Oracle, DB2, SQL Server, Teradata, Netezza and even SAP HANA, and delivers high-speed reporting and analysis. Besides, over 40 pre-built enterprise performance management, ERP and CRM analytical applications are already certified on Exalytics.

#1: No BI software included with HANA, Exalytics is a single-stop BI solution: In order to make sense of data, or, as the former CEO of Business Objects, Bernard Liautaud, aptly put it, to derive intelligence out of data, BI tools like dashboards, reports, scorecards and ad-hoc query and analysis tools are required. It looks like our able colleagues at SAP forgot this minor detail and didn’t include any BI tools with HANA. Exalytics, on the other hand, comes preloaded with Oracle BI Foundation. Oracle BI Foundation delivers the widest and most robust set of reporting, ad-hoc query and analysis, OLAP, dashboard and scorecard functionality, with a rich end-user experience that includes visualization, collaboration, alerts and notifications, search and mobile access. So, all you do with Exalytics is plug it in, connect to a data source and you are on your way to delivering pre-packaged or custom analytical applications.

Hopefully the above provides insightful context on Exalytics and its imposter, SAP HANA. Exalytics is and remains the industry’s first in-memory analytics machine. You can learn more about Oracle Exalytics here.