Category Archives: BI

SAP Cloud for Analytics: Tilting at Windmills

Back in the early 2000s, when I was running marketing at Business Objects, Gartner’s then-lead BI analyst, Howard Dresner (known as the father of BI and the person who named the category), started pushing a notion called enterprise performance management (EPM).  Back then, EPM meant the unification of BI and planning/budgeting.

The argument in favor of EPM made sense and was actually kind of cool:  with BI you could ask any question, but BI never knew the correct answer.  What did that mean?

It meant that BI tools were primarily tied to operational systems and could tell you the value of sales/salesrep for any quarter in any region.  The problem was that BI didn’t know what the answer was supposed to be.  BI knew the cost of everything and the value of nothing.

The solution was to tie BI to financial systems, which were full of targets and thus could tell us not just the value of any given metric, but what that value should be.
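
To make the idea concrete, here is a minimal sketch in Python, with invented figures: the operational side supplies the actuals, the plan supplies the targets, and joining the two tells you not just what a metric is, but whether it is good.

```python
# Toy illustration with invented numbers: BI alone knows the actuals;
# tying in the financial plan tells you what the number was supposed to be.

actuals = {("Q1", "West"): 4.2, ("Q1", "East"): 3.1}  # $M, from the operational system
targets = {("Q1", "West"): 5.0, ("Q1", "East"): 3.0}  # $M, from the planning system

for key, actual in actuals.items():
    target = targets[key]
    variance = actual - target
    print(f"{key}: actual={actual} target={target} "
          f"variance={variance:+.1f} ({variance / target:+.0%})")
```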

It sounded great and I bought in.  More importantly, so did the category, as the BI and planning vendors started pairing off.

Then what happened?  In my opinion, pretty much nothing.  Sure, Hyperion reps could increase deal sizes by trying to drop Brio licenses across the whole financial department, as opposed to just FP&A.  Cognos could cross-sell Adaytum, with the help of an overlay sales force.

But did integration happen?  No.  BI and financial planning/budgeting consolidated, but they never converged.  This is interesting because it’s rare.  By contrast, CRM really happened.  SFA vendors didn’t just acquire customer service vendors and marketing vendors — the three applications came together to create one category.

That didn’t happen with EPM.  You could always ask someone who worked at Hyperion my favorite question, “which side did you work on?” and you always heard either, “oh, the BI side,” or “oh, the finance side.”  You never, ever got asked to clarify the question.

Over time, EPM came to mean financial planning, budgeting, and consolidation (along with associated reporting/analytics) — and not the unification of BI and financial planning.

What did this prove?   You can put the two categories under one roof via consolidation, but the actual markets are oil-and-water and don’t mix together well.  Why?  Two reasons:

  • BI is a platform sale, EPM is an applications sale
  • BI is sold to IT, EPM is sold to the finance department

So other than selling to a completely different buyer with a completely different value proposition, they make excellent candidates for integration!  Put concretely, if you can’t talk about inter-company eliminations, AVB reports, AOPs, topside journal entries, long-range models, FX rate handling, and legal entities, then you can’t even start to sell EPM.  I marketed BI for 9 years and we talked a totally different language:  aggregate awareness, multi-pass SQL, slowly changing dimensions, and star schemas.  The two languages are not totally unrelated, but they are nevertheless different.

Despite this history, many vendors still seem hell-bent on mixing EPM water with BI oil.  One cloud EPM vendor positioned themselves for years as a leader in “BI and CPM,” somehow thinking the rock-bottom acquisition of a cheap scorecarding tool made them a player in the $15B BI market.

To be clear, I view EPM and BI as cousins.  Yes, in EPM we make scorecards, dashboards, and reports.  Yes, in EPM we do multi-dimensional modeling and analysis.  No doubt.  But we do it for finance departments, we tie our planning/budgeting systems to the general ledger and we are focused on both financial outcomes and financial reports.  Yes, we also care about integrating models across the organization — sales, marketing, services, and operations.  But we are not trying to sell generic infrastructure for making reports and visualizations across the enterprise.

Put simply, in EPM we use BI technologies to build financial applications that tie together the enterprise on behalf of the finance department.

Surprisingly, SAP didn’t get the consolidation-not-convergence memo.  This is somewhat amazing because SAP is a strong player in both BI and EPM, yet somehow seems not to have noticed that the two markets never converged, nor that there is a very good reason why.  They are still tilting at windmills, fighting with a vintage-2002 message to integrate two categories not destined for integration.

Here’s the press release:

SAP Redefines Analytics in the Cloud

WALLDORF — SAP SE (NYSE: SAP) today unveils the SAP Cloud for Analytics solution, a planned software as a service (SaaS) offering that aims to bring all analytics capabilities into one solution for an unparalleled user experience (UX).

Built natively on SAP HANA Cloud Platform, this high-performing, real-time solution plans to be embedded with existing SAP solutions and intends to connect to cloud and on-premise data to deliver planning, predictive and business intelligence (BI) capabilities in one analytics experience. The intent is for organizations to use this one solution to enable their employees to track performance, analyze trends, predict and collaborate to make informed decisions and improve business outcomes.

Note that, in addition to my strategic concerns, I have a few tactical ones as well:

  • This is a futures announcement without a date.  The service is “planned.”  The “planned benefits” are stated.  The only thing I can’t find in the plan is an availability date.
  • Pricing hasn’t been announced either.  So, other than what it costs and when it will be available, it was a most informative announcement.
  • While SAP is claiming that its previously announced SAP Cloud for Planning is included in the new offering, I have heard rumors on the street that SAP Cloud for Planning is actually being discontinued and customers will be moved to the new offering.  At this point, I’m not sure which is the case.

In the end, I’m not trying to beat on SAP in general.  I don’t love the Hana branding strategy, that’s true, but Hana itself (i.e., columnar, in-memory database) is a good idea.  I have no problems with SAP BI’s products — heck, my fingerprints still remain lightly on a few of them.  In EPM, we compete with SAP, so my agenda there is obvious.

But the thing I object to, the tilting at windmills, is that they are still banging the unify-EPM-and-BI drum.  SAP’s new analytics may eventually end up a reasonable or good BI solution.  But if they’re betting serious chips on unifying BI and EPM, it’s misguided.

Kellblog’s 10 Predictions for 2014

Since it is the season of predictions, I thought I’d offer up a few of my own for 2014, based on my nearly three decades of experience working in enterprise software with databases, BI tools, and enterprise applications.

See the bottom for my disclaimer, and off we go.  Here are my ten predictions for 2014.

  • Despite various ominous comparisons to 1914 made by The Economist, I think 2014 is going to be a good year for Silicon Valley.  I think the tech IPO market will continue to be strong.  While some Bubble 2.0 anxiety is understandable, remember that while some valuations today may seem high, the IPO bar is much higher today (at around $50M TTM revenues) than it was 13 years ago, when you could go public on $0 to $5M in revenues.  In addition, remember that most enterprise software companies (and many Internet companies) today rely on subscription revenue models (i.e., SaaS) which are much more reliable than the perpetual license streams of the past.  Not all exuberance is irrational.
  • Cloud computing will continue to explode.  IDC predicts that aggregate cloud spending will exceed $100B in 2014, growing at 25%, which is amazing given the scale.  Those are big numbers, but think about this:  some 15 years after Salesforce.com was founded, its head pin category, sales force automation (SFA), is still only around 40% penetrated by the cloud.  ERP is less than 10% in the cloud.  EPM is less than 5% in the cloud.  As Bill Gates once said about prognostication, “we always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next ten.”  IT is going to the cloud, inexorably, but change in IT never happens overnight.
  • Big Data hype will peak.   I remember the first time I heard the term “big data” (in about 2008 when I was on the board of Aster Data) and thinking:  “wow, that’s good.”  Turns out my marketing instincts were spot on.  Every company today that actually is — or isn’t — a Big Data play is dressing up as one, which creates a big problem because the term quickly starts to lose meaning.  As a result, Big Data today is nearing the peak of Gartner’s hype cycle.  As a term it will start to fall off, but real Big Data technologies such as NoSQL databases and predictive analytics will continue to have a bright future.
  • The market will be unable to supply sufficient Data Science talent.  If someone remade The Graduate today, they’d change Mr. McGuire’s line about “plastics” to “data science.”  Our ability to amass data and create analytics technology is quickly surpassing our ability to use it.  Job postings for data scientists were up 15,000% in 2012 over 2011.  Colleges are starting to offer data science degrees (for example, Berkeley and Northwestern).  There’s even a startup, Udacity, specifically targeting the need for data science education.  Because of the scarcity of data science talent, the specialization required to correctly use it, and the lack of required scale to build data science teams, data science consultancies like Palantir and Mu Sigma will continue to flourish.
  • Privacy will remain center stage.  Trust in “Don’t Be Evil” Google and Facebook has never been particularly high.  Nevertheless, it seems like the average person has historically felt “you can do whatever you want with my personal data if you want to pitch me an advertisement” — but, thanks to Edward Snowden, we now know we can add, “and if the government wants to use that data to stop a terrorist attack, then back off.”  It’s an odd asymmetry.  These are complex questions, but in a world where the cost of data collection will converge to free, will the privacy violation be in collecting the data or in analyzing it?  In a world where one trusted the government to adequately control the querying and access (i.e., where it took a warrant from a non-secret court), I’d argue the query standard might be good enough.  Regardless, the debate sparked thus far will continue to burn in 2014 and tech companies will very much remain in the center of it.
  • Mobile will continue to drive consumer companies like Dropbox and Evernote, but also enterprise companies like Box, Clari, Expensify, and MobileIron.  Turns out the enterprise killer app for mobile was less about getting enterprise applications to run on mobile devices and more about device proliferation, uniform access to content, and eventually security and management.  (And since I’m primarily an enterprise blogger, I won’t even mention social à la SnapChat or mobile gaming).  As one VC recently told me over dinner, “God bless mobile.”  Amen in 2014.
  • Social becomes a feature, not an app.  When I first saw Foursquare in 2010, I thought it should be the example in the venture capital dictionary for “feature, not company.”  Location-awareness has definitely become a feature and these days I do more check-ins on Facebook than Foursquare.  I felt the same way when I worked at Salesforce.com and we were neck deep in the “social enterprise” vision.  When I saw Chatter, I thought “cool, but who needs yet another communications platform.”  Then I realized you could follow a lead, a case, or an opportunity and I was hooked.  But those are all feature use-cases, not application or company use-cases.  Given the pace of Salesforce, they fell in love with, married, and divorced social faster than most vendors could figure out their product strategy.  In the end, social should be an important feature of an enterprise application, almost a fabric built across modules.  I think that vision ends up getting implemented in 2014.  (Particularly if Microsoft ends up putting in David Sacks as its next CEO as some speculate.)
  • SAP’s HANA strategy actually works.  I was one of relatively few people who was absolutely convinced that SAP’s $5.8B purchase of Sybase in 2010 was more about databases than mobile.  SAP is clearly crafting a strategy to move both analytics and transactional database processing onto HANA and they have been doggedly consistent about HANA and its importance to the firm going forward.  They have been trying for decades to eliminate their dependency on Oracle — e.g., the 1997 Adabas D acquisition from Software AG — and I believe this time they will finally succeed.  In addition, they will succeed — quite ironically — with their ingredient-branding strategy around HANA, using a database to differentiate an application suite, something that they themselves would have seen as heresy 20 years ago.
  • GoodData goes public.  Cloud-based BI tools have had a tough slog over the years.  Some good companies were too early to market and failed (e.g., LucidEra).  Birst, another early entrant, certainly hasn’t had an easy time over its ten-year history.  Personally, while I was always a fan of cloud-based applications (having become a big Salesforce customer in 2003), I always worried that with cloud-based BI tools, you’d have too much of the nothing-to-analyze problem.  GoodData got around that problem early on by adopting a Crystal-like OEM strategy, licensing their tools through SaaS applications vendors.  They later evolved to a general cloud-based BI platform and applications strategy.  The company was founded in 2007, has raised $75M in VC, is reportedly doing very well, and an IPO seems a likely event in its future.  I’m calling 2014.
  • Adaptive Planning gets acquired by NetSuite.  Adaptive Planning was founded in 2003 as a cloud-based planning company and — despite both aspirations and claims to the contrary — in my estimation continues to play the role of the low-priced, cheap-and-cheerful planning solution for small and medium businesses.  That market position, combined with an existing, long-term strategic relationship whereby NetSuite resells Adaptive as NetSuite Financial Planning, makes me believe that 2014 will be the year that NetSuite finally pulls the trigger and acquires Adaptive Planning.  I think this deal could go down one of two ways.  If Adaptive continues to perform as they claim, then a potential S-1 filing could serve as a trigger for NetSuite (much as Crystal Decisions’ S-1 served as a trigger for Business Objects).  Or, if Adaptive hits a rough patch in 2014 for any reason (including the curse of the new headquarters), then that could trigger NetSuite with a value-shopper impulse leading to the same conclusion.

I should end with a bonus prediction (#11) that Host Analytics, our customers, and my colleagues will enjoy a successful 2014, continuing to execute on our cloud strategy to put the E back in EPM — focus and leadership in the enterprise segment of the market — and continuing to acquire as customers both high-growth companies who want an EPM solution with which they can scale and enterprises seeking liberation from costly and painful Hyperion implementations and upgrades.

Finally, let me conclude by wishing everyone a Happy New Year and great business success in 2014.

Disclaimers

  • See my FAQ to understand my various allegiances and disclaimers.
  • Remember I am the CEO of Host Analytics so I have a de facto pro-Host Analytics viewpoint.  
  • Predictions are opinion:  I have mine; yours may differ.
  • Finally, remember the famous Yogi Berra quote:  “it’s tough to make predictions, especially about the future.”

Highlights from the 2011 Wisdom of Crowds Business Intelligence Market Study

When I think about technology vendors and industry analysts, an old song from Oklahoma! comes to mind:  The Farmer and the Cowman Should Be Friends.  (“Should be” as in “usually aren’t.”)

That said, in BI we were blessed to have a very strong cast of industry analysts who were both great analysts and great people.  It was a rare case of the farmer and the cowman actually becoming friends (for the opposite extreme, think:  ZL Technologies).

I’ve stayed in touch with one such cowman, Howard Dresner, even though for the past six years I was out of the BI market.  I came to know him during my 9 years at BusinessObjects where we sat  across the table when Howard was the lead BI analyst at Gartner.

Howard now runs an independent BI advisory service, Dresner Advisory Services, and as part of that business runs an annual survey that he calls the Wisdom of Crowds Business Intelligence Market Study.  In this post, I’ll share some highlights from the recently released 2011 study, with some help from a financial analyst report done on it by Frank Sparacino of First Analysis.

One key trend spotted in the report was the continuing evolution of BI purchasing from IT to the business and, as such, a commensurate reorientation of the tools themselves towards that end.  Sparacino says that Gartner has a new name for this class of tool, “data discovery,” and that such tools are characterized by three things:

  • A business-led purchasing cycle
  • A data visualization user interface (as opposed to report or grid)
  • Interactive analysis as the primary use-case (as opposed to reporting or monitoring)

These trends are consistent with those  mentioned in my previous post, Traits of Next-Generation BI.

Sparacino cites QlikTech and Tableau as the poster children for this next generation of BI.  While I am a huge fan of Tableau, and while both QlikTech and Tableau are definitely on fire, I believe there is a next-next generation that will soon be invading the market, led by companies like the still-stealth EdgeSpring or the recently launched Sisense.

In terms of BI spending, the report suggests healthy growth for BI in 2011 but not as strong as the growth in 2010.  It also indicates an increased focus on smaller initial deployments with later follow-on deployments as opposed to big bites.  It also shows that while perpetual licensing remains the dominant model, it is in decline relative to open source and subscription models.

Finally, in the overall rankings department, here are some of the key scores:

  • Tableau:  4.57
  • Dimensional Insight:  4.52
  • Information Builders:  4.29
  • Yellowfin:  4.23
  • Actuate/BIRT:  4.15
  • PivotLink:  4.06
  • QlikTech:  4.02
  • MicroStrategy:  3.91
  • Pentaho:  3.88
  • Jaspersoft:  3.83
  • Oracle:  3.71
  • BusinessObjects:  3.30

Because the vendors tend to fall into different buckets with respect to breadth and depth of product line, Dresner groups them into categories for arguably fairer comparisons:

  • Emerging:  Tableau, Dimensional Insight, Yellowfin, PivotLink
  • Pure Play:  MicroStrategy, QlikTech, Information Builders
  • Titan:  Oracle, BusinessObjects
  • Open Source:  Actuate/BIRT, Pentaho, Jaspersoft

Traits of Next-Generation BI (Business Intelligence)

I suppose it’s not surprising that, on the journey to find my ideal next gig, I’ve seen a lot of next-generation business intelligence (BI) companies.  Because I’ve thus had the chance to immerse myself in the BI startup world, I thought I’d share a quick glimpse of what’s presumably the BI future.

Because some of the companies I’ve seen are still stealth, I’m not going to name any early-stage names, but simply provide a list of common traits of next-generation BI companies.

Traits of next-generation BI:

  • In memory, columnar, and compressed.  Most solutions rely on the fact that the source data for most problems can now fit in memory, typically using a columnar and compressed format.  Some solutions are even able to perform work on the data without first decompressing it.  (See the sketch after this list.)
  • Fast.  The dream of BI — particularly for interactive analysis tools — has always been “speed of thought” analysis.  Thanks to the above point and thanks to additional performance optimizations (e.g., to exploit CPU cache locality), this dream is becoming a reality.
  • Directly connected.  Next-generation BI tools generally connect directly to the underlying source databases (and/or the Internet) to capture data.  This means they must also have basic data integration capabilities, both to properly align data from different systems and to refresh it dynamically.
  • Schema-free.  In order to accommodate semi-structured information and to be able to integrate information from different systems, next-generation BI does not require the up-front definition of a schema.  Instead, relationships among data (e.g., hierarchy) are discovered dynamically.
  • Beautiful.  While this is best exemplified by Tableau (where visualization is the principal focus), next-generation BI tools generally provide beautiful visualizations that are more powerful than the basic report and bar chart.  (Note that I named a name here because I consider Tableau mid-stage, not early-stage.)
  • Mobile.  Next-generation BI tools typically assume a browser-based client and often the need to create device-specific clients (e.g., a native iPad app) to supplement it.  Some companies focus exclusively on mobile BI.
  • Neutral.  Next-generation BI tools exploit the fact that a multi-billion dollar vacuum was created in the market when the BI leaders were consolidated and became units of IBM (e.g., Cognos) or SAP (e.g., BusinessObjects).
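
To make the first two traits concrete, here is a minimal Python sketch (illustrative only, with invented data) of a dictionary-encoded, in-memory column and a single-scan aggregation over it.  Real engines add heavier compression, vectorized execution, and cache-aware layouts, but the shape of the idea is the same.

```python
from array import array
from collections import defaultdict

class Column:
    """Dictionary-encoded column: each distinct value is stored once;
    rows become small integer codes (a simple form of compression)."""
    def __init__(self, values):
        self.dictionary = sorted(set(values))
        code = {v: i for i, v in enumerate(self.dictionary)}
        # Codes live in a compact, contiguous buffer -- cache-friendly to scan.
        self.codes = array("i", (code[v] for v in values))

# A toy "table" is just a dict of columns -- no up-front schema required;
# new columns can be bolted on as new sources appear.
table = {
    "region": Column(["West", "East", "West", "East", "West"]),
    "sales": array("d", [1.2, 0.8, 2.5, 1.1, 0.9]),  # $M, invented
}

# "Speed of thought" aggregation: one tight scan over the encoded column,
# accumulating per-code sums -- no precalculated aggregates anywhere.
region, sales = table["region"], table["sales"]
totals = defaultdict(float)
for i in range(len(sales)):
    totals[region.codes[i]] += sales[i]

for code in sorted(totals):
    print(region.dictionary[code], round(totals[code], 2))
```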

In many ways, next-generation BI takes us full circle back to the days of Cognos PowerPlay and its desktop-resident PowerCube (i.e., hypercube) — except that the cube is now virtual, schema-free, of effectively unlimited size, and contains no precalculated aggregates.  But like that era, the cube in many ways obviates the data warehouse infrastructure underneath it.  After all, if you can fit your entire data set in memory and dynamically calculate the answer to any question at high speed, then why do you need a data warehouse full of precalculated aggregates again?

The answer is “you do” for many cases (e.g., history, data cleansing) — but certainly not for all of them.  I thus see a “middle squeeze” on the data warehouse market in the future.

  • For most applications of normal size and analytic complexity, people will use next-generation BI on top of raw data sources, unless they have very messy data or a need for extensive history.
  • For large applications (i.e., big data) and/or high analytic complexity, people will use advanced analytic platforms (e.g., Aster Data).  This, of course, begs the question whether anyone is working on BI tools that exploit and optimize the new, high-end analytic engines, and the answer to that question is happily “yes” as well.

Thoughts on the Qlik Technologies (QlikTech) IPO

I spent an hour or so browsing the QlikTech S-1 and thought I’d share some observations.  (See here for my prior post on the company.)

  • The company has achieved good scale (2009 revenues of $157M) but growth has been decelerating from 82% in 2007 to 47% in 2008 to 33% in 2009.
  • Gross margins are high at 89% due largely to normal margins on license (96%), unusually  high margins on support (96%), normal margins on consulting (27%), and a fairly small consulting business (10% of total revenues) which reduces the pull-down effect on the weighted average.  Wall Street will like this.
  • Sales and marketing expense is high at 59% of sales.  Provided switching costs are high, you can argue this is a good investment, and provided growth is high, you can justify it.   I’m going to assume they make some “lost year” arguments about 2009 in their story and will guide to re-accelerated growth, but I’m not sure.  If not, then they will get pressure about the inefficiency of their sales model.
  • R&D is spectacularly low at 6% of sales.  There is an argument that if you have a largely completed (cheap and cheerful) BI tool, you should simply go sell the heck out of it and not artificially spend money on R&D when you have neither the vision nor the immediate need either to create new products or to invest big money in enhancing your existing one.  I’ve just seen few companies try to make that argument.  I suspect Wall St. will pressure them to increase this number, regardless of whether it’s the strategically right thing to do for the company.
  • Expanded customer base from 1,500 customers in 2005 to over 13,000 in 2009.
  • I like their argument that because it’s easier to use than traditional BI tools, it should get greater penetration than the average 28% of potential BI users cited by IDC.
  • The unique business model (free downloads and a 30-day post-purchase guarantee) is consistent with the cheap-and-cheerful product positioning, which is good.  It does beg the question of why sales cost so much, however, if you’re primarily upselling downloaders in a low-commitment fashion.
  • I think the claim “analysis tools are not designed for business users” is over-stated.  I can assure you that at BusinessObjects we were designing products for business users.
  • I dislike the small piece of huge pie argument, but I suppose that particular fallacy is so embedded in human nature that it will never go away.  I’d rather hear that QlikTech thinks its 2010 potential market is $400M and it wants 50% than hear — as it says in the prospectus — that they think it’s $8.6B and they presumably want somewhere around 2%.
  • They expect 63M shares outstanding after the offering, implying that if they want a $10-$15 share price, they think the company can justify a market cap in the $750M to $1B range.  If it were generating more than a 4% return on sales and growing faster than 33%, that would be easier to assume.
  • 50% of 2009 license and first-year maintenance (FYM) revenues came from indirect channels.  This again begs the question of why sales cost so much; indirect channels are, in theory, more cost-effective than direct.
  • They had 124 “dedicated direct sales professionals” as of 12/31/09, which suggests to me that at an average productivity of $1.8M (including all ramping and turnover effects) they could do $223M in revenues in 2010, or growth in the 40% range.  So they seem well teed-up from a sales hiring perspective.
  • If my US readers are wondering why you’ve not heard of them, it’s because they were originally founded in Sweden and do 77% of revenues “internationally” (which now means outside the US given that they moved their headquarters in 2004).   This relative lack of US presence should presumably hurt the stock.
  • They have a pretty traditional enterprise software business model:  perpetual license and maintenance.  They even state potential demand for SaaS BI as a risk factor.
  • They had $35M in deferred revenue on the balance sheet as of 12/31/09.  This strikes me as high; some quick back-of-the-envelope calculations led me to expect ~$25M if it was all the undelivered portion of pre-paid, single-year maintenance contracts.
  • Per IDC, 44% of QlikView customers deploy within a month and 77% deploy within three months.  It sounds impressive and is consistent with the small consulting business.  But it also depends on the definition of deploy.
  • This is no overnight success story; the company was founded in Sweden in 1993.  There was a six-year product development phase (which perhaps explains the low R&D today) from 1993 to 1999.  From 1999 to 2004 they sold almost exclusively in Europe.  From 2004, they added USA sales and relocated the HQ to Pennsylvania.
  • 2009 maintenance renewal rate of 85%.
  • They intend R&D expenses to increase both in absolute dollars and as a percent of sales going forward.
  • 73% of revenues are not dollar denominated.  This means that foreign exchange rates should hit them more (both ways) than for a typical software company.
  • This sounds typical:

Our quarterly results reflect seasonality in the sale of our products and services. Historically, a pattern of increased license sales in the fourth quarter has positively impacted sales activity in that period which can make it difficult to achieve sequential revenue growth in the first quarter. Similarly, our gross margins and operating income have been affected by these historical trends because the majority of our expenses are relatively fixed in the near-term.

  • USA revenues grew at 28% in 2009, a bit slower than the company overall.  Fairly surprising, given the late USA start and the presumably huge market opportunity.
  • R&D remains in Lund, Sweden with 54 staff as of 12/31/09.
  • 574 total employees as of 12/31/09 with 148 in the USA and 426 outside.
  • Accel is the biggest shareholder with 26.7% of the stock, pre-offering.
  • The proposed ticker symbol is QLIK.
  • My brain started to melt around page 120.  (Somehow the document set I managed to pull down from the SEC site is about 1,000 pages and includes a zillion appendices.  The regular S-1 is here.)
  • Click on the image below to blow up their recent financials.

Au Revoir LucidEra

Apparently, SaaS BI player LucidEra is shutting down as of the end of this month.

Excerpt:

He [marketing VP, friend, and former Business Objects colleague Darren Cunningham] would not go into details regarding LucidEra’s financial problems other than to say, “It was a matter of funding or being acquired. And neither of those things happened.”

I’ve always thought BI was a particularly difficult category “to SaaS” for a number of reasons:

  • The dependency on external data. Operational apps (e.g., NetSuite, Salesforce) inherently have data associated with them, data that gets loaded initially when you configure the app, and data that gets supplemented every day when you use it. BI is a blank slate that requires data to be useful.
  • The variability of user requirements. There are only so many ways to call a lead, pay an invoice, or promote an employee (i.e., implement use-cases in transactional apps). In BI, just about any question you can imagine is fair game and different people think about things in different ways. This is one reason why BI has seemed to defy “applicationization,” despite repeated attempts from multiple vendors. While you certainly can package some common reports and dashboards, my guess is you’re grabbing only 20% of the requirements, not the 80% you grab when you package up transactions.
  • The variability of data. While software industry consolidation is slowly reducing the number of different sources from which BI needs to pull data, BI still needs to pull data from a wide variety of sources. This is a complex problem, made more complex by the need to pull from both SaaS and traditional sources, and serves to undermine both applicationization and multi-tenancy. My hunch is you either (1) need to be highly vertically focused and “force” all retailers (for example) into your retail data warehouse or (2) end up doing custom implementations for each of your customers at both a data integration and data warehousing level, and you become a hosting vendor of custom applications instead of a true SaaS vendor.

Simply put, I think BI’s hard to bottle and you probably need to be very big, very focused (in either a vertical market or application-centricity sense), or both, if you want to succeed.

Interestingly, LucidEra doesn’t blame the category for its own demise:

LucidEra’s decision to shut down was brought about by a lack of funding, not a lack of interest in its products or in SaaS BI as a whole, Cunningham said.

Another friend and former Business Objects coworker, Timo Elliott (who stayed on with SAP post-acquisition), covers the winding-down in his BI Questions blog in this thoughtful post, The End of a LucidEra.

Excerpt:

My position has always been that on-demand business intelligence is an essential part of the market, but that some of the claimed benefits have been over-hyped.

In particular, I don’t think the debate should be about choosing between on-demand and on-premise: customers should be able to seamlessly and easily move between one and the other according to their needs, using the same technology platform.

SAS Acquires Teragram

In a not terribly surprising move, SAS announced on Monday that it is acquiring Teragram. Seth Grimes of Intelligent Enterprise covers the announcement here. CMS Watch discusses it here.

This is basically a replay of the $76M Business Objects / Inxight deal last May.

At a trends level, this is about the BI vendors (can you say that anymore? Perhaps I need to say BI vendor, since SAS is the only independent left) wanting to perform analytics against both structured and unstructured data. For most of the evolution of the BI category, unstructured data was not part of the picture. Only in the past few years has it really hit the radar in the BI community.

The approach these folks typically take is to use text mining engines to structure the unstructured data. For example, identify which documents talk about which products. Identify which documents have which tone. Then load that data into the data warehouse so you can run a report that shows sales by week with two more columns added: number of emails to customer support and % negative in tone.
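
Here is a minimal sketch of that pipeline in Python. Everything in it (the products, tone words, and emails) is invented, and a trivial keyword tagger stands in for a real text mining engine, but it shows the shape: structure each document, then roll the results up into the extra columns you would load into the warehouse next to the sales facts.

```python
import re

# A trivial keyword tagger standing in for a real text mining engine.
# Products, tone words, and emails below are all invented.
NEGATIVE_WORDS = {"broken", "crash", "refund", "angry"}
PRODUCTS = {"widget", "gadget"}

def structure(email_text):
    """Turn one unstructured email into structured fields."""
    words = set(re.findall(r"[a-z]+", email_text.lower()))
    return {"products": words & PRODUCTS,
            "negative": bool(words & NEGATIVE_WORDS)}

emails = [
    "My widget arrived broken and I want a refund",
    "Loving the new gadget, great job",
    "The widget is great",
]

# Roll the structured rows up into warehouse-style columns:
# support email count and % negative in tone, per product.
rows = [structure(e) for e in emails]
for product in sorted(PRODUCTS):
    mentions = [r for r in rows if product in r["products"]]
    pct_neg = 100 * sum(r["negative"] for r in mentions) / len(mentions) if mentions else 0
    print(f"{product}: support_emails={len(mentions)} pct_negative={pct_neg:.0f}%")
```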

Put differently, when you have a multi-billion dollar data infrastructure and you’re faced with content, what do you want to do with it? Turn it into data. This is not a bad start and it does enable more and better dashboards and reports. However, to build really powerful content analytic applications, I believe that you need a specialized server (i.e., an XML server like MarkLogic) to handle the documents.

In many ways, it’s like OLAP. If you wanted basic slice and dice, you could build that into a reporting tool. But if you wanted real OLAP — big databases, instant performance, complex analysis — then you needed a specialized DBMS — an OLAP server — to do it.