Business Intelligence Clouds – The Skies the Limit

I am back…(for now, or so it seems these days) – I promise to get back to one post a month if not more.

Yes, I am known for my frequent use of puns, bordering on the line between cheesy and relevant. Forgive the title. It has been over 110 days since I last posted, which for me is a travesty. Despite my ever-growing list of activities, both professional and personal, I have always put my blog in the top priority quadrant.

Enough ranting…I diverged; and now I am back.

OK, cloud computing (BI tools related) seems to be all the rage, right up there with mobile BI, big data and social. I dare use my own term coined back in 2007, 'Social Intelligence', as others have now trademarked the phrase (but we, dear readers, know the truth –> we have been thinking about the marriage between social networks / social media data sets and business intelligence for years now)…Alas, I diverge again. Today, I have been thinking a lot about cloud computing and Business Intelligence.

Think about BI and portals, like SharePoint (just to name one). It was all the rage (or perhaps still is): "Integrate my BI reporting with my intranet / portal / SharePoint web parts." OK, once that was completed successfully, did it buy much in terms of adoption or savings or any of those ROI catchphrases? "Buy our product, and your employees will literally save so much time they will be basket-weaving their reports into TRUE analysis!" What they didn't tell you was that more bandwidth meant less need for those people, which in turn meant people went into scarcity mode/tactics, trying to make themselves seem, or be, relevant. And I don't fault them for this. Companies were not ready, or did not want to think about, what they were going to do with the newly freed-up resources they would have when the panacea of BI deployments actually came to fruition. And so, the wheel turned. What was next? Reports became dashboards; dashboards became scorecards (each the complement of the former); scorecards introduced proactive notification/alerting; alerting introduced threshold-based notification across multiple devices/methods, one of which was mobile; mobile notification brought the need for mobile BI –> and frankly, I will say it: Apple brought us the hardware to see the latter into fruition. Swipe, tap, double tap –> drill-down was now fun. Mobile made portals seem like child's play. But what about when you need to visualize something and ONLY have it on a spreadsheet?

(I love hearing this one; as if the multi-billion-dollar company whose employee claims to only have the data on a spreadsheet didn't get it from somewhere else. I know, I know –> in the odd case, yes, this is true…so I will play along)…

The "only on a spreadsheet" crowd made mobile seem restrictive; enter RoamBI and the likes of MicroStrategy (yes, MicroStrategy now has a data import feature for spreadsheets with advanced visualizations for both web and mobile). Enter QlikView for the web crowd. The "I'm going to build a dashboard in less than 30 minutes" sales force: "wait…that's not all, folks…come now (to the meeting room) with your spreadsheet, and watch our magicians create dashboards to take with you from the meeting."

But no one cared about maintenance, data integrity, cleanliness or accuracy…I know…they are meant to be nimble, and I see their value in some instances and some circumstances…Just like the multi-billion-dollar company that only tracks data on spreadsheets…I get it; there are some circumstances where they exist…But it is not the norm.

So, here we are…mobile offerings here and there; build a dashboard on the fly; import spreadsheets during meetings; but what happens when you go back to your desk, have to open up your portal (still), and now have a new dashboard that only you can see unless you forward it out manually?

Enter cloud computing for BI; but not at the macro scale; let's talk personal…Personal clouds: individual sandboxes of a predefined amount of space over which IT has no sanction other than to bless how much space is allocated…From there, what you do with it is up to you. Hackles going up, I see…How about this…


Salesforce.com –> the biggest CRM cloud today. And for many years now, SFDC has embraced cloud computing. And big data, for that matter; and databases in the cloud (database.com, in fact)…Lions and tigers and bears, oh my!

So isn't it natural for BI to follow CRM into cloud computing? OK, OK…for those of you whose hackles are still up, some rules (you IT folks will want to read further):

Rules of the game:

1) Set an amount of space (not to be exceeded, no matter what) – but be fair and realistic; 100 MB is useless. In today's world, a 4 GB flash drive was advertised for $4.99 during the back-to-school sales, so I think you can pony up enough to help make the cloud useful.

2) If you delete it, there is a recycle bin (like on your PC/Mac); if you permanently delete it, too bad/so sad…We need to draw the line somewhere. Poor SharePoint admins around the world are having to drop into STSADM commands to restore Alvin Analyst's Most Important Analysis that he not only moved into the recycle bin but then permanently deleted.

3) Put some things of use in this personal cloud at work, like BI tools; upload a spreadsheet and build a dashboard in minutes with visualizations like the graph matrix (a crowd pleaser) or a time series slider (another crowd favorite; people just love time-based data 🙂 ). But I digress (again)…

4) Set up BI reporting on the logged events; understand how many users are using your cloud environment, how many are getting errors, and what errors they are getting and why. This simple type of event-based logging is very informative (see the sketch after this list). (We BI professionals tend to overthink things, especially those of us who are also physicists.)

5) Take a look at what people are using the cloud for; if you create and add meaningful tools like BI visualizations and data import and offer viewing via mobile devices like iPhone/iPad and Android or web, people will use it…
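Here is the sketch promised in rule 4: a minimal Java example, assuming a hypothetical pipe-delimited event log of the form timestamp|user|event|status (the log format and all names are invented for illustration), that answers the how-many-users, how-many-errors, which-errors questions.

import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

// A minimal sketch of event-based usage reporting for a personal BI cloud.
public class CloudUsageReport {
    public static void main(String[] args) {
        List<String> log = List.of(
            "2011-08-01T09:14|alvin|upload_spreadsheet|OK",
            "2011-08-01T09:20|alvin|build_dashboard|ERROR:quota_exceeded",
            "2011-08-01T10:02|laura|view_dashboard|OK");

        Set<String> users = new HashSet<>();                 // how many users are using the cloud
        Map<String, Integer> eventCounts = new HashMap<>();  // what they are doing
        Map<String, Integer> errorCounts = new HashMap<>();  // what errors they hit, and how often

        for (String line : log) {
            String[] f = line.split("\\|");
            users.add(f[1]);
            eventCounts.merge(f[2], 1, Integer::sum);
            if (f[3].startsWith("ERROR")) errorCounts.merge(f[3], 1, Integer::sum);
        }
        System.out.println(users.size() + " users; events: " + eventCounts + "; errors: " + errorCounts);
    }
}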

This isn't a corporate iTunes or MobileMe cloud; this isn't Amazon's Elastic Compute Cloud (EC2). This is a cloud with the sole purpose of supporting BI; wait, not just supporting, but propelling users out of the doldrums of the current state of affairs and into the future.

It’s tangible and just cool enough to tell your colleagues and work friends “hey, I’ve got a BI cloud; do you?”

BIPlayBook.Com is Now Available!

As an aside, I'm excited to announce that my latest website, http://www.biplaybook.com, is finally published. Essentially, I decided that you, dear readers, were ready for the next step. What comes next, you ask?

After Measuring BI Data –> Making Measurements Meaningful –> Massaging Meaningful Data into Metrics, what comes next is to discuss the age-old questions of 'So What?' and 'What Do I Do About It?'

BI PlayBook offers readers the next level of real-world scenarios, now that BI has become the nomenclature of yesteryear and is used by most to inform decisions. Basically, it is the same, with the added bonus of how to tie BI back into the original business process, the customer service/satisfaction process, or really any process of substance within a company.

This is quite meaningful to me because so often, as consumers of goods and services, we find our voices go unheard, especially when we are left dissatisfied. Can you muster the courage to voice your issue (dare I say, 'complain'?) using the only tools provided: poor website feedback forms, surveys or (gasp) relaying your issue by calling into a call center or IVR system (double gasp)? I don't know if I can…

How many times do we get caught in the endless loop of an IVR, only to be 'opted out' (aka hung up on) when we do not press the magical combination of numbers on our keypads to reach a live human being? Or, when we are sneaky, pressing '0' only to find out the company is one step ahead of us, having programmed '0' to automatically transfer our call to our friend 'ReLisa Boutton', aka the Release Button()?

Feedback is critical, especially as our world has become consumed by social networks. The ensuing 'chatter' of customers, choosing to 'Like' or join your company page or product, or to tweet about the merits or demerits of your value proposition, is not only rich if one cares about understanding one's customer; it is also a key to how well you are doing in the eyes of that customer. Think about how many customer satisfaction surveys you have taken that ask whether or not you would recommend a company to a friend or family member.

This measure defines one's NPS, or Net Promoter Score, and is a commonly shared KPI, or key performance indicator, for a company.
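For the concrete arithmetic, here is a minimal Java sketch with invented survey answers: respondents scoring 9-10 are promoters, 0-6 are detractors, and the score is the percentage of promoters minus the percentage of detractors.

// Net Promoter Score from 0-10 survey answers:
// promoters score 9-10, detractors 0-6; NPS = %promoters - %detractors.
public class NetPromoterScore {
    public static double nps(int[] answers) {
        int promoters = 0, detractors = 0;
        for (int a : answers) {
            if (a >= 9) promoters++;
            else if (a <= 6) detractors++;
        }
        return 100.0 * (promoters - detractors) / answers.length;
    }

    public static void main(String[] args) {
        int[] survey = {10, 9, 8, 7, 6, 3, 9, 10, 5, 8}; // invented responses
        System.out.printf("NPS: %.1f%n", nps(survey));    // 4 promoters - 3 detractors = +10.0
    }
}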

Yet, market researchers like myself know that what a customer says on a survey isn't always how they will behave. This discrepancy between what someone says and what someone does is as age-old as our parents telling us as children to do as they say, not as they do. Therefore, limiting oneself to an NPS score will restrict the ability to truly understand one's Voice of the Customer. And further, if you do not understand your customers' actual likelihood to recommend you to others or to repeat-purchase from you, how can you predict their lifetime value or propensity for future revenue earnings? You can't.

Now, I am ranting. I get it.

But I want you to understand that the social media content available from the social network spheres can fill that gap. It can help you understand how your customers truly perceive your goods or services. Trust me: customers are more likely to tweet, venting in 140 characters or less about a negative experience, than they are to take the time to fill out a survey. Likewise, they are more likely to rave about a great experience with your company.

So, why shouldn't this social 'chatter' be tied back into the business intelligence platforms and, further, mined out specifically to inform customer feedback loops, voice of the customer & value stream maps, for example?

Going one step further, having a BI PlayBook focuses the attention of the metric owners on the areas that need to be addressed, while filtering out the noise that can detract from the intended purpose.

If we are going to make folks responsible for the performance of a given metric, shouldn’t we also help them understand what is expected of them up front, as opposed to when something goes terribly wrong, signified by the “text message” tirade of an overworked CEO waking you out of your slumber at 3 AM?

Further, understanding how to address an issue, whom to communicate with and, most importantly, how to resolve and respond to affected parties are all part of a well-conceived BI playbook.

It truly takes BI to that next level. In fact, two years ago, I presented this very topic at the TDWI Executive Summit in San Diego (Tying Business Processes into Your Business Intelligence). While I got a lot of stares à la 'dog tilting head to the side in that confused glare at its owner' look, I hope people can draw back on that experience with moments of 'ah ha – that is what she meant', now that they have evolved (a little) in their BI maturation.

Gartner BI Magic Quadrant 2011 – Keeping with the Tradition

Gartner Magic Quadrant 2011

I have posted the Gartner Business Intelligence 'BI' Magic Quadrant (in addition to the ETL quadrant) for the last several years. To say that I missed the boat on this year's quadrant is a bit extreme, folks, though for my delay, I am sorry. I did not realize there were readers who counted on me to post this information each year. I am a few months behind the curve on getting this to you, dear readers. But, with that said, it is better late than never, right?

Oh, and who is really ‘clocking’ me anyway, other than myself? But that is a whole other issue for another post, some other day.

As an aside, I am excited to say that my latest website, http://www.biplaybook.com, is finally published. Essentially, I decided that the next step after Measuring BI Data, Making the Measurements Meaningful, and Modifying Meaningful Data into Metrics was to address the age-old question of 'So What?' or 'What Do I Do About It?'

BI PlayBook offers readers real-world scenarios that I have solved using BI or data visualizations of sorts, but with the added bonus of how to tie it back into the original business process you were reporting on or trying to help with BI, or back into the customer service/satisfaction process. This latter one is quite meaningful to me, because so often we find our voices go unheard, especially when we complain to large corporations via website feedback, surveys or (gasp) calling into their call center(s). Feedback should be directly tied back into the performance being measured, whether it is operational, tactical, managerial, marketing, financial, retail, production and so forth. So, why not tie that back into your business intelligence platforms, using feedback loops and voice-of-the-customer maps / value stream maps to do so?

Going one step further, having a BI PlayBook allows the end users of your BI systems, those signed up for and responsible for the metrics being visualized and reported out to the company, to know what they are expected to do to address a problem with a metric, who they are to communicate both the issue and the resolution to, and what success looks like.

Is it really fair of us, BI practitioners, to build and assign responsible ownership to our leaders of the world without giving them some guidance (documented, of course) on what to do about these new responsibilities? We are certainly the 1st to be critical when a 'red' issue shows up on one of our reports/dashboards/visualizations. How cool would it be to look at these red events, see the people responsible getting alerted to said fluctuation, and further, see said person take appropriate and reasonable steps towards resolution? Well, a playbook offers the roadmap, or guidance, around this very process.

It truly takes BI to that next level. In fact, two years ago, I presented this very topic at the TDWI Executive Summit in San Diego (Tying Business Processes into your Business Intelligence). The PlayBook is the documented ways and means to achieve this outcome in a real-world situation.

“LAURA” Stratification: Best Practice for Implementing ‘Social Intelligence’

Doing an assessment of how and where to learn from social media to better understand your business drivers can be daunting, especially when you want to overlay how those drivers affect your goals, customers, suppliers, employees, partners…you name it.

I came up with this process, which happens to mimic my name (shameless self-persona plug), to ease the assessment while providing a guided assessment plan.

First, 'Learn' to Listen: learning from the voice of the customer/supplier/partner is an extremely effective way to understand how well you are doing at retaining, acquiring or losing your relationships with those you rely on to operate your business.

Second, Analyze what matters; ignore or shelve (for later) what doesn't. Data should be actionable (metrics in your control to address), reporting key performance indicators that are tied to corporate strategies and goals to ensure relevancy.

Third, Understand your constituent groups; it isn’t just your customers, but also your shareholders, employees, partners, and suppliers who can make or break a business through word of mouth and social networking.

Fourth, Relate your root causes to your constituents' value perceptions, loyalty drivers and needs to ensure relevancy flows through from step 2. Map these to the business initiatives and goals exercise from step 2. Explore gaps between initiatives, value perceptions, loyalty drivers and corporate goals.

Lastly, create Action plans to address the gaps discovered in Step 4. If you analyzed truly actionable data in step 2, this should be easy to do.

To apply this to social media in order to turn it into social intelligence, you need to make the chatter of the networks meaningful and actionable.

To do this, think about this example:

 

A person tweets a desire to stop using a hotel chain because of a bad experience. In marketing, this is known as an "intent to churn" event; when social intelligence reporting systems ferret out this intent by scouring the web commentaries of social networks, an alert can be automatically forwarded to your customer loyalty, marketing/social media or customer response teams to respond, address and retain said customer.

A posting might say "trouble with product or service" – that type of message can be sent to customer operations (service) or warranty service departments as a mobile alert.

And a "having trouble replenishing item; out of stock" question on a customer forum can be passed along to your supply chain or retail teams — all automatically.
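A hedged sketch of that routing idea in Java: naive keyword matching stands in for whatever text mining a real social intelligence platform would actually use, and the phrases and team names are invented for illustration.

import java.util.Map;

// Route social 'chatter' to the team that can act on it.
public class SocialAlertRouter {
    static final Map<String, String> ROUTES = Map.of(
        "never staying", "Customer Loyalty",    // intent to churn
        "trouble with", "Customer Operations",  // product/service trouble
        "out of stock", "Supply Chain");        // replenishment issue

    public static String route(String post) {
        String text = post.toLowerCase();
        for (Map.Entry<String, String> r : ROUTES.entrySet()) {
            if (text.contains(r.getKey())) return r.getValue();
        }
        return "No alert";
    }

    public static void main(String[] args) {
        System.out.println(route("Never staying at this hotel chain again!")); // Customer Loyalty
        System.out.println(route("Having trouble with my new dashboard"));     // Customer Operations
    }
}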

The Wynn has a great feedback loop, using social media to alert them in real time to customers who are dissatisfied and who tweet or comment about it during their stay.

The hotel manager and response team will find this person to address and rectify the situation before they check out. And before long, the negative tweet or post is replaced by an even more positive response and, best of all, WORD OF MOUTH to friends and family.

It's sad to say that in this day and age we are often left without a voice, or with one that goes unheard by our providers of services and products. When good service comes, we are so starved that we rejoice about it to the world. And why not? That is how good companies excel and excellent companies hit the echelon of amazing companies!

Applying the Bing Decision-Engine Model to "Business Intelligence" and Other Musings

Yes, folks, I am back. Wait, didn't I write that before?

Well, after having my 1st child, I spent many months (just shy of 10, to be exact) noodling business intelligence and the concepts I had previously discussed on my blog. For the last 5 years, I have been touting the need for better search integration, offering up the BI mashup concept before people really understood what a plain vanilla dashboard was, and being met by glazed stares and confusion. Now that folks are catching on to the iGoogle experience, and the ability to "mashup" or integrate several points of interest or relevance into a dashboard, I want to discuss this topic again. But this time, I want to apply the concept of the Decision Engine instead of just the Search Engine when it comes to ways to make BI content more meaningful, more relevant and more useful to end users.

Side note: "mashup" is still not a recognized word in the spell-check-driven dictionary lists for the greater population of enterprise applications.

Coupled with my mashup passion was my belief in eye-tracking studies. Eye-tracking measures where a person looks, and the concentration of their gaze on a particular area of an object of interest, say a website, for example. In the case of business intelligence, I applied eye-tracking studies to the efficacy of dashboard design in order to better understand the areas where the human brain focused its concentration vs. those it ignored (despite what the person might say was of interest to them).

Advertisers have known about eye-tracking studies for years and have applied the results to their business. For example, the eyes will focus on the top-left corner first, whether on a TV screen, a book, a piece of paper or a dashboard. It is the area of the greatest concentration; therefore, special importance has been paid to that piece of advertising real estate. And since the recent rise in popularity of folks like Stephen Few and Edward Tufte, whose design principles for effective dashboard design have driven many a BI practitioner to rethink the look and feel of what they are designing, this concept of 'top left is most important' has become commonplace.

And the handful of other book-grade principles have risen to the surface too: less is more when it comes to color, overuse of lines in graphs is distracting, save pie for dessert (pie charts, that is), etc. But tying it all together is another story altogether. Understanding how human perception, visual perception and dashboard design meet is a whole other can of worms, and usually requires a specialized skill set to fully "grok" (sorry, but I love Heinlein's work). 🙂

Excuse my digression…


Take a look at this image, which shows eye-tracking results from the three most popular search engines in 2006:

 

Notice the dispersion of color measured in the Yahoo and MSN examples vs. Google. This is correlated to the relevancy of the results and content presented on the page. And 4 years ago, Google's search engine was a popular go-to tool for many when it came to finding related websites to help answer questions. Fast-forward 4 years, and MSN is now Bing, and what was the search engine is now dubbed the "decision engine."

The advent of the decision engine, in my eyes, stems from the dilution of search engine effectiveness caused by the flood of results presented to end users. In fact, I am sure the results of an eye-tracking study from 2010 would be vastly different as a result of the exponential growth of web-based content available for crawling.

The same has occurred within enterprise business-intelligence platforms. What was introduced as powerful has really become inundated with content, in the form of reports, objects, dimensions, attributes, attribute elements, actual metrics, derived metrics and the list goes on and on.

Superficially, search was introduced as an add-on to the enterprise BI platforms. An add-on; really, an afterthought.

To the credit of the solutions on the market (grouped into a collective unit), people didn't realize what they didn't know, or better put, what they needed to know, when building the technology behind their solution offerings. And they needed to start somewhere. It was only after BI became more mass-adopted in corporate America, and the need for some level of reporting grew pervasive into even the smallest mom-and-pop shop, that people began to realize that visualizing the data was one thing; finding the results of those visualizations or data extractions was an entirely different can of worms.

At the same time as this was happening, the search giants started innovating and introducing the world to the concepts of real-time search and the "decision engine" named Bing. Understanding the statistical models behind how search algorithms work, even simplistically, knowing enough to be dangerous, is an investment that any reader of this blog and any BI practitioner would be smart to make.

In a nutshell, my belief? Applying those principles, and the eons of dollars the search giants have thrown at optimizing said models, is an effective way for BI solutions at any level to leverage the work done to advance search research and technology, instead of just patching BI platforms with ineffective search add-ons. Just look back at the Golden Triangle study graphic above, and remember that long before BI design experts like Tufte and Few said it, advertising gurus knew that the top-left real estate of any space is the most important space to reach end users. So, instead of thinking of search as a nice add-on for your BI platforms, why not see it as a necessity? If a report is loaded into a repository and no one knows about it, was it ever really done? Let alone meaningful or valuable enough to be adopted by your end users? Think about it…
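To make that concrete, here is a toy Java sketch of the idea: treat the report repository like a search index and rank reports by query relevance instead of burying them in folders. The term-counting score is deliberately simplistic, a stand-in for a real search algorithm, and the report names and metadata are invented.

import java.util.List;
import java.util.Locale;

// Rank BI reports by how well their metadata matches a user's query.
public class ReportSearch {
    record Report(String name, String metadata) {}

    static int score(Report r, String[] terms) {
        String text = (r.name() + " " + r.metadata()).toLowerCase(Locale.ROOT);
        int s = 0;
        for (String t : terms) {
            int idx = 0;
            while ((idx = text.indexOf(t, idx)) >= 0) { s++; idx += t.length(); }
        }
        return s;
    }

    public static void main(String[] args) {
        List<Report> repo = List.of(
            new Report("Weekly Sales Dashboard", "sales revenue region weekly"),
            new Report("Churn Analysis", "customer churn retention loyalty"));
        String[] query = "customer churn".toLowerCase(Locale.ROOT).split("\\s+");
        repo.stream()
            .sorted((a, b) -> score(b, query) - score(a, query)) // most relevant first
            .forEach(r -> System.out.println(score(r, query) + "  " + r.name()));
    }
}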

Talking about Getting Started \ Processing 1.0

 


Getting Started \ Processing 1.0

Gotta ask my audience for commentary on this one…How many of you are using the Processing 1.0 environment/language to build your complex data visualizations?

Processing.org describes it as follows: "Processing is a simple programming environment that was created to make it easier to develop visually oriented applications with an emphasis on animation and providing users with instant feedback through interaction." (http://processing.org/learning/gettingstarted/)

I have been using this app since college, and being in BI professional services/development now, I tend to overlook the simplicity and ease of use of the Processing language, functions and environment (PDE).

Has anyone else used it for building data visualizations?

Strategic Business Intelligence in Times of Economic Turmoil

Ideas for business intelligence practitioners to forge ahead with their BI initiatives in times of economic turmoil – to pursue best-in-class business intelligence and data management without incurring the wrath of the monolithic, centralized platforms built when times were marked by economic growth in most revenue-bearing verticals. Can this race hold its pace, with the same velocity and momentum, when the economy shifts winds against the runners? In this larger-than-life, uber-BI applications race, marked in the last 13–24 months by a high rate of BI mergers and acquisitions, one has to wonder what will happen when the dust settles and the acquired folks realize they are no longer part of the little-guy organization that was bought out way back when. Imagine how ProClarity felt when giant Microsoft came a-callin'. Or when SAP acquired BusinessObjects; did it become: B-I-C ERP, meet B-I-C, well, BI?

What about the true R&D exploratory labs like Google Labs, which churn out some interesting advancements in the technology space, offering APIs and SDKs for free to the bold and daring willing to take them up on their offer (oh, and that's no wilting flower of a number, folks…Google had a cap of 10,000 invitations when their App Store went live in the spring of 2008)? Some of the cool new data warehousing appliances, and the process changes that came with Master Data Management, came about from the die-hard open source fans who wanted to bring some structure to the masses without the cost of enterprise platforms, with their clunky deployment paths and costly upgrades. Let's not forget the Adobe Flash-frenzied dashboard users, now introduced to presentation-layer-worthy interactive dashboard gauges that made mouths salivate the first through third time viewing them… As was expected, vendors tried to update their applications to mirror such interactivity and integration with the MS Office stack, though Xcelsius still corners the presentation-layer market by far (open source has some cool contenders, especially when it comes to data visualizations) as the race to the dashboard 2.0 space moves into the collaborative world of social networks.

No, there really is a Santa Claus, and I, too, am still smitten with PerformancePoint Planning! PPS truly rocks the product placement in this arena; it is much harder to appeal to that broad category of stuffy financial budgeting and planning CFOs, and the like.

So I ask: with the downturn in the economy, can such advancements be made in what is clearly a capitalized business intelligence software industry, whose recent growth spurts marked a growing sense of entitlement yet with subpar execution and results upon implementation, and where services and solutions costs drove positive spikes in software sales, and vice versa? In fact, an interdisciplinary, and while highly debated, interdependency exists between BI, social networks, collaboration and portals with custom embedded BI apps, web services and more, all geared with one goal in mind: to optimize in a cost-effective manner, in an effort to drive better, more data-driven decision making. Or is this another blue-skies-and-apple-pie dream manifested by one girl's love for business intelligence?

Enterprise Architecture Got You Down?

Try this simplified toolkit approach based on standards defined by the Federal Enterprise Architecture (FEA) board, along with NASCIO:

 

Performance Reference Model (PRM)
• Inputs, outputs, and outcomes
• Uniquely tailored performance indicators

In this category, you should immediately think Scorecard (Balanced Scorecard and otherwise):

–Each scorecard has 4-6 perspectives, which are logical/categorical groupings of key indicators, or what I like to call 'affinitized' KPIs.

–Each perspective has fewer than 6-7 KPIs (Key Performance Indicators). (If you receive pushback, and you will, as people would define a KPI for the percent of time the express grocery queue contains purchasers with more than the specified limit of 12-15 items, doll it up for the BBB as a complaint, and deliver it with such fervor one almost winces when they realize said complaint is recycled faster than their next trip to the grocery store.)

–Remember the 3 keys to success with defining KPIs: are they actionable, are they measurable (not in a future state; can you measure it today), and will they drive a performance-based behavior change (think incentives, as they represent a perfect example of a performance-based behavior changer). A minimal sketch of this scorecard structure follows this list.

–So, ask the question now… "That's a long list, John or Jane Exec… First, what are you going to do with that information [the so-what question distinguishing actionable from interesting]… can you really drive performance improvements in your business with more than 5 indicators, in the case where all 5 go red at the same time?" Or, if you don't want to be direct, you can ask, "How do those KPIs link to your performance objectives personally?" Any leader worth their title will take the time to align the activities and work tasks that they personally, or through delegation, list in their performance reviews.

–This process can be started at any layer in an organizational hierarchy and is called Goals / Objectives Alignment or Cascaded KPIs

–Most leaders you ask have no more than a few KPIs that they ACTUALLY care about – so just get over the hurt feelings now, or the feeling that you have wasted x number of hours measuring and reporting on metrics that top leadership doesn't care about; if you have been in BI long enough, you will have experienced this at least once in your career.
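Here is the sketch promised above, a minimal Java rendering of the scorecard shape (all names and KPIs invented): a scorecard holds a handful of perspectives, each perspective a short list of 'affinitized' KPIs, and each KPI has to pass the three-question test before it earns its slot.

import java.util.List;

// A scorecard perspective and the three-question KPI test.
public class Scorecard {
    record Kpi(String name, boolean actionable, boolean measurableToday, boolean drivesBehavior) {
        boolean earnsItsSlot() { return actionable && measurableToday && drivesBehavior; }
    }
    record Perspective(String name, List<Kpi> kpis) {}

    public static void main(String[] args) {
        Perspective financial = new Perspective("Financial", List.of(
            new Kpi("New Sales Growth %", true, true, true),
            new Kpi("Express-lane item-count violations", false, true, false))); // interesting, not actionable
        for (Kpi k : financial.kpis()) {
            System.out.println(k.name() + " -> keep? " + k.earnsItsSlot());
        }
    }
}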

 

Business Reference Model (BRM)
• Lines of Business (functions and sub-functions)
• Agencies, customers, partners

 

How will the performance KPIs cascade down to the individual contributor from the CEO's goals? Easy – take this example:

The CEO sets a goal of increasing revenue by growing the Sales line of business, specifically new customer sales. He sets a goal of 35% growth in new sales revenue, which the VP of Sales is tasked to drive. They, in turn, assign the goal to the account managers in charge of new customer accounts, who then add the same goal to their sales force in the field. The KPI becomes New Sales Growth >= 35%; frequency is set to weekly, with hierarchical rollups to monthly and quarterly aggregations.

 

–Now, you may ask yourself, what about the Operations department where Customer Sales and Service (aka Telesales) lives, and bingo! You're getting it now…Even though it was tasked to the VP of Sales, they, or that same CEO, should have realized that the Telesales department can also generate revenue from new sales, just those that come through a different channel. Instead of the typical route of calculating in-field sales and measuring the sales department alone for a goal of this nature, the Operations VP should have the same goal on their performance review as the VP of Sales, which they, in turn, delegate to their Telesales center managers, who delegate to the supervisors, who delegate it to the agents on the phone. While it is an implicit delegation, as anyone hired to man a Telesales line understands their job is to answer the phone and make sales (thus encouraging sales growth), it is still an action mandated by a supervisor, who received the mandate from their manager, who likely received the goal from the corporate-office VP of Operations or the person responsible for the Telesales center.

–It flows from top or bottom (vertically) as well as horizontally since in this example, it covers two horizontal business units (Sales and Operations).

–This is why starting with objectives or goals makes this process, that is, cascading KPIs, that much easier because you have a definitive starting point and end point which is that same objective/goal.

 

New Sales Growth >= 35% is the same whether you start with the call center agent, whose awesome sales performance on the phones helps her supervisor meet their goal to grow sales by 35%, which enables their manager, who enables the VP of Operations' and the VP of Sales' objectives, who then meet or exceed their CEO's original mandate.
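A small Java sketch of that rollup, with invented figures: the same New Sales Growth >= 35% comparison runs at every level of the hierarchy, while weekly actuals aggregate to monthly (and, by extension, quarterly) totals.

// Weekly actuals roll up; the same target applies at every level.
public class CascadedKpi {
    static final double TARGET = 0.35; // 35% new-sales growth, set by the CEO

    static double growth(double current, double baseline) {
        return (current - baseline) / baseline;
    }

    public static void main(String[] args) {
        double[] weeklyNewSales = {120_000, 135_000, 150_000, 160_000}; // one month, by week (invented)
        double monthlyTotal = 0;
        for (double w : weeklyNewSales) monthlyTotal += w;

        double baselineMonth = 400_000; // same month last year (invented)
        double monthlyGrowth = growth(monthlyTotal, baselineMonth);

        // The identical comparison applies for agent, supervisor, Telesales
        // manager, VP of Operations, VP of Sales, and CEO.
        System.out.printf("Monthly growth: %.1f%% (target %.0f%%) -> %s%n",
            monthlyGrowth * 100, TARGET * 100,
            monthlyGrowth >= TARGET ? "GREEN" : "RED");
    }
}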

 

Service Component Reference Model (SRM)
• Service domains, service types
• Business and service components

–A service component is defined as "a self-contained business process or service with predetermined functionality that may be exposed through a business or technology interface."

 

Data Reference Model (DRM)
• Business-focused data standardization
• Cross-agency information exchanges

 

Technical Reference Model (TRM)
• Service component interfaces, interoperability
• Technologies, recommendations

Today I dub ‘Data Services Oriented Architecture’ for a Web 2.0 and beyond World

As David Besemer wrote in his May 2007 article for DMReview, ‘SOA for BI’, "It took Michelangelo nearly five years to complete his famous works at the Sistine Chapel. Your transition to SOA for BI can go much faster if you start with data services."

What are data services? According to Wikipedia…wait, there isn't an existing definition on Wikipedia. So first, a definition, which I share with the Internet users of the world vis-à-vis Wikipedia:

"A Data Services Oriented Architecture or ‘DSOA’ framework consists of a combination of schemas, classes and libraries that facilitate and provide the ability to create and consume data services for the web. DSOA reveals the consumer data underlying architecture, exposed using Data Services’ Entity Data Model, and provides reusability of your service when developed correctly," Laura Edell-Gibbons, Mantis Technology Group Inc.


And by correctly, I would highly recommend not getting bogged down by the concept of plug and chug (dubbed by my colleague Tip) or by making your code reusable. It is all a balance; remember, young BI padawan.
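To illustrate (and only to illustrate; every name here is invented), a DSOA-style data service boils down to a small, reusable contract in Java that exposes entities to any consumer, a report, a dashboard, a mashup, without tying the consumer to the underlying store:

import java.util.List;

// A reusable data service contract and a toy in-memory implementation.
public class DsoaSketch {
    record CustomerEntity(int id, String name, double lifetimeValue) {}

    interface CustomerDataService {
        CustomerEntity byId(int id);
        List<CustomerEntity> topByLifetimeValue(int limit);
    }

    public static void main(String[] args) {
        // In-memory stand-in; a real service would live in the data services tier.
        CustomerDataService svc = new CustomerDataService() {
            final List<CustomerEntity> rows = List.of(
                new CustomerEntity(1, "Acme", 125_000),
                new CustomerEntity(2, "Globex", 98_000));
            public CustomerEntity byId(int id) {
                return rows.stream().filter(c -> c.id() == id).findFirst().orElse(null);
            }
            public List<CustomerEntity> topByLifetimeValue(int limit) {
                return rows.stream()
                    .sorted((a, b) -> Double.compare(b.lifetimeValue(), a.lifetimeValue()))
                    .limit(limit).toList();
            }
        };
        System.out.println(svc.topByLifetimeValue(1)); // the consumer never sees the store
    }
}

The design point is the interface, not the implementation: once the contract exists in a unified data services tier, the reuse-vs-extend-vs-build decision discussed below becomes a per-project choice rather than a rewrite.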

Select best-in-class data services middleware to help you model, develop and implement your back-end BI services. PowerDesigner is a pretty rocking modeling tool, covering everything from data element impact analysis to facilitating requirements gathering. Simplistically speaking, I am a big fan of the simplicity of SQL Server Integration Services and the new Data Services, both Microsoft products, though this opinion certainly doesn't necessarily represent the popular vote. I am also a big fan of Informatica and of Data Integrator (now called Data Services, funnily enough, under the SAP/BusinessObjects brand).

During the first and second projects, be sure to track all productive working hours to deliver each phase of your solution, along with the cost savings from the efficiencies you designed your system around; there is an inescapable expectation that your system will be the ROI-generator, a widely accepted expectation being that all BI systems have high ROI, and many do. Start small, and grow to enterprise scale once the concept has been leaned out and the expected efficiencies (and beyond) are gained. Then, as you expand your deployment from project to enterprise, you can easily self-fund additional licenses and other required resources with the savings or other benefits gained on those first two, somewhat painful, 'initiation' projects. We all have to go through the process, and while painful at times, the learning experiences offered outweigh the difficulties.

It is better to build the new services project by project, always making the predecessor available to other projects in a unified data services tier as you go. You and your team can then choose whether to reuse a data service, extend an existing service, buy, or build something from scratch, my least favorite, BTW.

Over time, these will change and I suspect ‘reuse’ will become the greatest portion of the proverbial pie, whereas today, I believe the paradigm shifts more in the direction of ‘build from scratch.’

Starting to plan front-end BI services up front, even while deploying your back-end BI services, will enable you to make small but meaningful steps without much noticeable downtime to the organization, something especially important for those of us working with a '4 x 9s' uptime SLA for our data centers. Plus, remember, if you build these on a powerful data services foundation, you will reduce your time to market, and your TCO over time. By providing the business with their much-anticipated and needed operational reports, tactical and strategic dashboards and performance management analytics, while infusing the lot into your SOA, one will reap rewards greater than my words could ever portray, dear reader…Til then, remember: 'what will come sooner than you think is no more when than how'.

References:

  • David Besemer. "SOA for BI." DMReview, May 2007.

SWF Search-ability Announcement from Adobe and How It Relates to Xcelsius 2008

    Imagine Xcelsius dashboards, especially those built in 2008 with the flexible add-on component manager, making it that much easier to customize components (think objects/widgets like scatterplots, which are offered out of the box as a chart type)…

     

    Now, we have a best practice for monetizing the SWF content that is part of your Xcelsius 2008 dashboard…here is what Adobe had to say:

     

    Adobe is teaming up with search industry leaders to dramatically improve search results of dynamic web content and rich Internet applications (RIAs). Adobe is providing optimized Adobe Flash Player technology to Google and Yahoo! to enhance search engine indexing of the Flash file format (SWF) and uncover information that is currently undiscoverable by search engines. This will provide more relevant automatic search rankings of the millions of RIAs and other dynamic content that run in Adobe Flash Player. Moving forward, RIA developers and rich web content producers won't need to amend existing and future content to make it searchable—they can now be confident that it can be found by users around the globe.

    Why is this news important?

    Adobe is working with Google and Yahoo! to enable one of the largest fundamental improvements in web search results by making the Flash file format (SWF) a first-class citizen in searchable web content. This will increase the accuracy of web search results by enabling top search engines to understand what’s inside of RIAs and other rich web content created with Adobe Flash technology and add that relevance back to the HTML page.

    Improved search of SWF content will provide immediate benefits to companies leveraging Adobe Flash software. Without additional changes to content, developers can continue to provide experiences that are possible only with Adobe Flash technology without the trade-off of a loss in search indexing. It will also positively affect the Search Engine Optimization community, which will develop best practices for building content and RIAs utilizing Adobe Flash technologies, and enhance the ability to find and monetize SWF content.

    Why is Adobe doing this?

    The openly published SWF specification describes the file format used to deliver rich applications and interactive content via Adobe Flash Player, which is installed on more than 98 percent of Internet-connected computers. Although search engines already index static text and links within SWF files, RIAs and dynamic web content have been generally difficult to fully expose to search engines because of their changing states—a problem also inherent in other RIA technologies.

    Until now it has been extremely challenging to search the millions of RIAs and dynamic content on the web, so we are leading the charge in improving search of content that runs in Adobe Flash Player. We are initially working with Google and Yahoo! to significantly improve search of this rich content on the web, and we intend to broaden the availability of this capability to benefit all content publishers, developers, and end users.

    Which versions of the SWF file format will benefit from this improved indexing and searching?

    This solution works with all existing SWF content, across all versions of the SWF file format.

    What do content owners and developers need to do to their SWF content to benefit from improved search results?

    Content owners and developers do not have to do anything to the millions of deployed SWF files to make them more searchable. Existing SWF content is now searchable using Google search, and in the future Yahoo! Search, dramatically improving the relevance of RIAs and rich media experiences that run in Adobe Flash Player. As with HTML content, best practices will emerge over time for creating SWF content that is more optimized for search engine rankings.

    What technology has Adobe contributed to this effort?

    Adobe has provided Flash Player technology to Google and Yahoo! that allows their search spiders to navigate through a live SWF application as if they were virtual users. The Flash Player technology, optimized for search spiders, runs a SWF file similarly to how the file would run in Adobe Flash Player in the browser, yet it returns all of the text and links that occur at any state of the application back to the search spider, which then appears in search results to the end user.

    How are Google and Yahoo! using the Adobe Flash technology?

    Google is using the Adobe Flash Player technology now and Yahoo! also expects to deliver improved web search capabilities for SWF applications in a future update to Yahoo! Search. Google uses the Adobe Flash Player technology to run SWF content for their search engines to crawl and provide the logic that chooses how to walk through a SWF. All of the extracted information is indexed for relevance according to Google and Yahoo!’s algorithms. The end result is SWF content adding to the searchable information of the web page that hosts the SWF content, thus giving users more information from the web to search through.

    When will the improved SWF searching solutions go live?

    Google has already begun to roll out Adobe Flash Player technology incorporated into its search engine. With Adobe’s help, Google can now better read the SWF content on sites, which will help users find more relevant information when conducting searches. As a result, millions of pre-existing RIAs and dynamic web experiences that utilize Adobe Flash technology, including content that loads at runtime, are immediately searchable without the need for companies and developers to alter it. Yahoo! is committed to supporting webmaster needs with plans to support searchable SWF and is working with Adobe to determine the best possible implementation.

    How will this announcement benefit the average user/consumers?

    Consumers will use industry leading search engines, Google now and Yahoo! Search in the future, exactly as they do today. Indexed SWF files will add more data to what the search engine knows about the page in which it’s embedded, which will open up more relevant content to users, and could cause pages to appear at a higher ranking level in applicable search results. As a result, millions of pre-existing rich media experiences created with Adobe Flash technology will be immediately searchable without the need for companies and developers to alter content.

    When will the new results register on Google?

    Google is using the optimized Adobe Flash Player technology now, so users will immediately see improved search results. As Google spiders index more SWF content, search results will continue to get better.

    How will this announcement benefit SWF content producers?

    Organizations can now dramatically improve the rich web experiences they deliver to customers and partners by increasing the use of Adobe Flash technology, which is no longer impeding the ability for users to find those experiences in highly relevant search results. RIA creators and other web content producers can now be confident that their rich media and RIA experiences leveraging Adobe Flash technology are fully searchable by users around the globe who use the dominant search engines. Furthermore, the ability to index information extracted throughout the various states of dynamic SWF applications reduces the need to produce an HTML or XHTML backup for the RIA site as a workaround for prior search limitations.

    Does this affect the searchability of video that runs in Adobe Flash Player?

    This initial rollout is to improve the search of dynamic text and links in rich content created with Adobe Flash technology. A SWF that has both video and text may be more easily found by improved SWF search.

    Will Adobe Flex applications now be more easily found by Google search, including those that access remote data?

    Yes, any type of SWF content including Adobe Flex applications and SWF created by Adobe Flash authoring will benefit from improved indexing and search results. The improved SWF search also includes the capability to load and access remote data like XML calls and loaded SWFs.

    Does Adobe recommend a specific process for deep-linking into a SWF RIA?

    Deep-linking, in the case of SWF content and RIAs, is when there is a direct link to a specific state of the application or rich content. A variety of solutions exist today that can be used for deep-linking SWF content and RIAs. It’s important that sites make use of deep links so that links coming into a site will drive relevance to the specific parts of an application.

    To generate URLs at runtime that reflect the specific state of SWF content or RIA, developers can use Adobe Flex components that will update the location bar of a browser window with the information that is needed to reconstruct the state of the application.

    For complex sites that have a finite number of entry points, you can highlight the specific URLs to a search spider using techniques such as site map XML files. Even for sites that use a single SWF, you can create multiple HTML files that provide different variables to the SWF and start your application at the correct subsection. By creating multiple entry points, you can get the benefits of a site that is indexed as a suite of pages but still only need to manage one copy of your application. For more information on deep-linking best practices, visit www.sitemaps.org/faq.php.

    Is Adobe planning on providing this capability to other search vendors too?

    Adobe wants to help make all SWF content more easily searchable. As we roll out the solution with Google and Yahoo!, we are also exploring ways to make the technology more broadly available.

    Where to go from here

    For more information from Google on SWF search, read Improved Flash indexing on the Official Google Webmaster Central Blog.

    .NET vs. Java Consumer SDK – BusinessObjects Enterprise

    The Java and .NET versions of the consumer SDK are identical in functionality. The two versions of the SDK are generated from a common set of Web Service Definition Language (WSDL) files. As a result, they possess identical class names and inheritance patterns. There are differences between the two, however, that are addressed in this section.

    Note:    For more information on the Platform Web Services WSDL, see Using the WSDL instead of the consumer API.

    Organization of plugin classes

    It is the goal of this SDK to provide the same organizational structure of plugin classes as provided in the traditional, non-web services Enterprise SDK.

    In Java, classes are organized in packages where the name of the plugin is part of the package. For example, the CrystalReport class is located in the com.businessobjects.enterprise.crystalreport package, while the Folder class is located in the com.businessobjects.enterprise.folder package.

    In .NET, classes are organized in namespaces based on plugin type. There are separate namespaces for destination, authentication, desktop, and encyclopedia plugin classes. For example, both the CrystalReport and Folder classes are desktop plugins, so they are located in the BusinessObjects.DSWS.BIPlatform.Desktop namespace.

    There is also a separate namespace for system rights in .NET.

    Representation of class properties

    WSDL class properties are generated differently in Java and .NET. In Java, properties are generated as getX and setX methods, where X is the name of the property. In .NET, properties are generated as fields.

    In this guide, the term "property" refers to both the class method in Java and its field equivalent in .NET.
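    For instance (a hand-written stand-in for illustration, not actual generated SDK code), a hypothetical property called Title would surface in Java as getTitle/setTitle methods, where in .NET the same property would surface as a public Title field:

    // Java side of the convention: WSDL property "Title" -> getTitle/setTitle.
    // (.NET equivalent would be a field access: report.Title)
    public class CrystalReportStub {
        private String title;

        public String getTitle() { return title; }           // lowercase method names in Java
        public void setTitle(String title) { this.title = title; }

        public static void main(String[] args) {
            CrystalReportStub report = new CrystalReportStub();
            report.setTitle("Quarterly Sales");
            System.out.println(report.getTitle());
        }
    }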

    Capitalization of method names

    In Java, method names begin with a lowercase character. In .NET, method names begin with an uppercase character.

    In this guide, the convention is to refer to a method by its Java case.


    Graphical Representation of the Process Towards Business Intelligence Enlightenment

    It all begins with an idea…whether an idea that came to you or one derived on a senior executive's whim, all business intelligence initiatives start with a single thought: how can I drive more data into our decision making and business processes in order to drive better, more accurate decisions for the business, thus enabling world-class operations and growth potential? Whew – that was a mouthful.
    In all reality, let this graphical representation flow as organically as the thought I am trying to emphasize here – BI is a thought process and a fundamentally human need – so we, as technologists, need to start building software applications that meet that genuinely simple conceptual need: to create software that not only improves my efficiencies at home or at work, but that marries those efficiencies to human adaptive and behavioral neurological processes. When synapses fire in one's brain, and neurological circuitry passes a signal from its origination point into another cortex, one can visualize how to tie this metaphor into the systems we use every day. Take the process of searching the Internet using your favorite search engine: we enter keywords or metadata tags, nouns, verbs or contextual fragments, reflecting the natural neural processing of the human brain. If search engines, and likewise BI systems, were constructed to more closely mirror this reflexive neural network within their enterprise application architecture, one might find more usefulness in the long run in terms of end-user adoption, and sustainment of said adoption past the 1st year after implementation. Let this pictorial represent that behavioral marriage between BI technologies and human neural networks.

    Increasing Business Value vs. Insights Provided – a Business Intelligence Roadmap

     

    While many companies feel they have strong BI programs, most, in my experience, have operational reporting systems and sometimes, if you are lucky, strategies that go with those systems or, even better, that drive those systems (fueled by requirements gleaned from business needs and actual usage scenarios vs. the "way it has always been done/reported on").

    As you can see in figure 1, that level merely tells you the 'WHAT' – it doesn't answer 'WHY' it happened (root cause), predict 'HOW' it might affect you in the future, nor cover the 'WHEN' in terms of monitoring whether it is happening currently or was just a past event.

     (img source TDWI Research at http://tdwi.org, 2007)

    Your BI roadmap should have a similar long-term plan. If you want to provide increasing value to your organization, you must get out of the business of operational reporting and move towards the real meat and value of BI, which lies in analyzing, monitoring and predicting, framed by how the business views its needs from BI, not by how BI believes it can deliver information.

    Guerrilla Marketing, Meet Business Intelligence; The Result? Social BI Networks, or BI 2.0

    Casual users who become adopters compound the "viral" tipping point of the "usage effect". When a system eases the user's experience, rather than the reverse, which never leads anywhere good <insert the person who best fits 'I deleted it from the registry with a shrug', who inadvertently deletes their O/S, here>, why not try these stages to gain a better understanding of your BI efficacy from your users?

     

    I dubbed this measurement technique 3XEI (in order): educate, illustrate, elaborate, integrate, extrapolate, iterate. The follow-down process is defined as follows:

    1. Seek first to understand, then to be understood; EDUCATE yourself on the pain points from the eyes of the customer ("Voice of the Customer").
    2. ILLUSTRATE to prior adopters, detractors and neutral parties the benefits of BI by mapping out a visualization map of their current process, circling all pain points in the process.
    3. To reduce your dependence on IT to support business-driven application acquisitions, continue to interview candidates, walk the process and mine as much as you can, in order to ELABORATE on the process map until you've captured all process metrics.
    4. INTEGRATE your approach to educating, illustrating and elaborating on your BI program; you will gain end-to-end insights previously obscured by not factoring in the multicollinearity of inputs on each other as well as on the intended output.
    5. EXTRAPOLATE from pain points and use the information to drive decisions or changes to the process at large.
    6. ITERATE on the illustrated processes as the extrapolation exercise dictates or requires, in order to remain vital and relevant to the organization.

    These phases have no definitive start/stop points. They are as mutable as the changing technology landscape in corporations today, and they remain embedded for the life of the process at the broad organizational level. Very few companies understand the power of mapping all efforts to the core business process supported, nor can they visualize the power of an optimized, end-to-end process on driving the bottom line. Those that do reap the rewards; and those that don't, well, they need not be mentioned.

    PerformancePoint Planning Server Configuration Notes for MCTS 70-556 Certification Exam

      


    Topics: Models and Dimensions – Day 1

    (see PowerPoint reference below for notes from day 1)


    Dimensions – Day 1

    Once you have your model, you have a cube within Analysis Services to analyze. Then you can slice your dimensions to get to the particular leaf level of information you need:

    These correspond to your Excel PPS add-in report creation wizard options:

     

     


    Rules – Day 2

    The magic for getting those pesky financial ratios calculated correctly (outside of the cube) lives in the Business Rules section, found by clicking on a model name and then clicking on the 3rd tab (see image below).

    PerformancePoint Planning Server (PPS Planning) uses PEL expression syntax as its native query language — on the upside, PEL can be converted to both native SQL and native MDX, which is a nice feature!

       

       

    3 Types of Rules


    1) ASSIGNMENT RULES – you can copy and paste the debug code output by PPS PEL into the SQL Server Analysis Services (SSAS) 2005 Management Studio client tool, or the SSAS 2000 MDX Sample App client tool, for assignment rules ONLY.

    Example assignment type business rule – real-world PEL expression:

    –comment: all rules are nested within scope (); scope statements that are set explicitly or variably within the embedded ‘THIS’ statement below.

    SCOPE(

    [Scenario].[All Members].[Forecast],

    [Time].[YQM].[July FY2008]:[Time].[YQM].[December FY2008],

    [Account].[BizCorpAccount].[20010], — external sales

    [BusinessProcess].[Standard].[input],

    [Entity].[BizCorpEntity].[All].leafmembers,

    [TimeDataView].[All Members].[PERIODIC],

    –[Currency].[All Members].[USD],

    [Currency].[All Members].[Total All Currencies].LeafMembers,

    [InterCompany].[All Members].[None],

    [Geography].[BizCorpGeo].[ALL].leafmembers,

--[Geography].[BizCorpGeo].[NAM],

    [Products].[BizCorpProducts].[All].LeafMembers,

    [Version].[All Members].[Current]) ;


    THIS = ([Account].[BizCorpAccount].[35300] * [Account].[BizCorpAccount].[35200]);

    END SCOPE;

    Here is the native MDX code for the above assignment rule to help bridge the gap between PEL and what you might be more used to when writing against multi-dimensional sources (assuming those interested in this section on MDX are OLAP developers and architects):

WITH CELL CALCULATION queryCalc
FOR
'([Measures].[Value],
[Account].[BizCorpAccount].[Level 11].&[5001],
[Scenario].[All Members].[Scenario].&[9],
{[Time].[YQM].[Month].&[200807] : [Time].[YQM].[Month].&[200812]},
[BusinessProcess].[Standard].[Level 06].&[8],
Descendants([Entity].[BizCorpEntity].[(All)].&[0], 1073741823, LEAVES),
[TimeDataView].[All Members].[TimeDataView].&[1],
Descendants([Currency].[All Members].[(All)].&[0], 1073741823, LEAVES),
Descendants([Geography].[BizCorpGeo].[(All)].&[0], 1073741823, LEAVES),
[InterCompany].[All Members].[InterCompany].&[-1],
Descendants([Products].[BizCorpProducts].[(All)].&[0], 1073741823, LEAVES),
[Version].[All Members].[Version].&[1])'
AS
(([Account].[BizCorpAccount].[Level 03].&[5090],[Measures].[Value]) * ([Account].[BizCorpAccount].[Level 03].&[5089],[Measures].[Value]))
SELECT NON EMPTY
([Measures].[Value],
[Account].[BizCorpAccount].[Level 11].&[5001],
NonEmpty(({[Scenario].[All Members].[Scenario].&[9]},
{{[Time].[YQM].[Month].&[200807] : [Time].[YQM].[Month].&[200812]}},
{[BusinessProcess].[Standard].[Level 06].&[8]},
{Descendants([Entity].[BizCorpEntity].[(All)].&[0], 1073741823, LEAVES)},
{[TimeDataView].[All Members].[TimeDataView].&[1]},
{Descendants([Currency].[All Members].[(All)].&[0], 1073741823, LEAVES)},
{Descendants([Geography].[BizCorpGeo].[(All)].&[0], 1073741823, LEAVES)},
{[InterCompany].[All Members].[InterCompany].&[-1]},
{Descendants([Products].[BizCorpProducts].[(All)].&[0], 1073741823, LEAVES)},
{[Version].[All Members].[Version].&[1]})))
properties [Scenario].[All Members].Key,
[Time].[YQM].Key,
[Account].[BizCorpAccount].Key,
[BusinessProcess].[Standard].Key,
[Entity].[BizCorpEntity].Key,
[TimeDataView].[All Members].Key,
[Currency].[All Members].Key,
[Geography].[BizCorpGeo].Key,
[InterCompany].[All Members].Key,
[Products].[BizCorpProducts].Key,
[Version].[All Members].Key
ON COLUMNS
FROM [Forecast]


2) DEFINITION RULES – same as assignment rules, except the result is calculated at run time and is not stored within the cube.

    Sample expression for a definition rule for calculating variance to budget:

    scope (

    [TimeDataView].[All Members].[PERIODIC],

    [BusinessProcess].[Standard].[INPUT],

    [Account].[BizCorpAccount].[20320].leafmembers,

    [Scenario].[All Members].[VarBudget_pct]) ;

    This=(([Scenario].[All Members].[Budget]-[Scenario].[All Members].[Actuals])

    /

    [Scenario].[All Members].[Budget]

    );

    end scope;
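The ratio in that rule is simple enough to sanity-check outside PPS. A quick illustration in plain Python (the numbers are made up, purely to make the arithmetic concrete):

# (Budget - Actuals) / Budget -- the same ratio the PEL rule above computes.
def variance_to_budget_pct(budget, actuals):
    if budget == 0:
        raise ValueError("a budget of zero makes the ratio undefined")
    return (budget - actuals) / budget

print(variance_to_budget_pct(120_000, 100_000))  # 0.1666..., i.e. ~16.7% under budget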

3) INTERCOMPANY RECONCILIATION RULE – when one entity holds the receivable and the other holds the payable, this rule balances your balance sheets.
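PPS generates this rule for you, so rather than guess at its PEL, here is a toy illustration in Python of what the reconciliation enforces – for each trading pair, one side's receivable should offset the other side's payable. Entity names and amounts are invented:

# Toy model of intercompany reconciliation; sample data is invented.
from collections import defaultdict

balances = [
    # (entity, counterparty, amount): receivables positive, payables negative
    ("EntityA", "EntityB",  50_000.0),
    ("EntityB", "EntityA", -49_200.0),
]

def out_of_balance(rows, tolerance=0.01):
    # Net each A<->B pair; anything that does not net to ~zero needs review.
    nets = defaultdict(float)
    for entity, counterparty, amount in rows:
        nets[tuple(sorted((entity, counterparty)))] += amount
    return {pair: net for pair, net in nets.items() if abs(net) > tolerance}

print(out_of_balance(balances))  # {('EntityA', 'EntityB'): 800.0} -- unbalanced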

       



    Report Properties – Day 3

    Properties within Reports:

• Under Options:
  • Capture changes before workflow action – writes to SQL before the workflow action; false by default, which is best practice due to the amount of data sent.
  • Capture design-time formulas – allows for writable cells in Excel reports; default is true.
  • Inherit design-time formulas – on the Excel side; true by default, but can slow things down… Put the formula at the All member so that the leaves inherit the formulas.
  • Spreading options – use NO spread in most instances.

            

    Security within PPS – Day 3

    Windows AD or LDAP only

If you have write access but NOT read, you have nothing – READ MUST BE IN THERE.

     

    Users can be assigned to many roles

If READ ONLY, assign at the ALL MEMBERS level; if WRITE, assign at the leaf level.

Use the same service account as SSIS (or other ETL tools) instead of the PPS service accounts – and ensure your login is not associated as an owner.

It should be the same account as the SSIS package "Data import" service account – it will pass through the credential details of the account. It should not be the same account as the ADMIN account, or anything requiring ADMIN rights on SSAS…
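That read-before-write gotcha is easy to encode as a sanity check. A toy sketch of the effective-permission logic (my names, not the PPS security API):

# Toy model of the PPS permission rule above: WRITE without READ is nothing.
def effective_access(read, write):
    if not read:
        return "none"           # write without read grants nothing
    return "read/write" if write else "read-only"

assert effective_access(read=False, write=True) == "none"
assert effective_access(read=True, write=False) == "read-only"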

       

        



Cycles, Jobs and Assignments (workflow) are all found under the Process Management link.


       

    Cycles 

A cycle is a time period for 1 scenario (Actual, Budget or Forecast) and 1 time range.

Check 'Define a recurrence' for a forecast that happens every year.

       

• Start and end dates are for non-recurring cycles (entered on the next screen) – the first and last days to input in the forecast before close-out.
  • For recurring cycles, select the App Calendar and the period of recurrence (Month is typical).
  • For each repetition of the forecast, enter the time here for when assignments can be made and the forecast can be edited. This is the 'open books' period.

       

Summary slide for assigning cycles (slide omitted – see the PowerPoint files noted below).

       

            

Remember: forecast cycles are the outside container of time for all activities.

1 cycle = 1 scenario x 1 model x 1 range of time.
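That 1 x 1 x 1 constraint is worth making concrete; here is a hypothetical sketch of it as a data structure (not the actual PPS object model):

# Hypothetical sketch of a cycle: one scenario, one model, one time range.
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class Cycle:
    scenario: str                  # "Actual", "Budget" or "Forecast"
    model: str                     # exactly one model per cycle
    start: date                    # first day input is open
    end: date                      # last day before close-out
    recurs_monthly: bool = False   # recurrence per the App Calendar

fy08 = Cycle("Forecast", "Corporate", date(2008, 7, 1), date(2008, 12, 31), True)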

               

     


Workflow / Assignments: see the PowerPoint files (please email me if you want these)

       

Assignments go to the user in Excel – Submit and Submit Draft actually hit the database; otherwise, changes are saved locally only.

       

    Once submitted, it is closed!
       

Status of submission:

Pending – first status; waiting in the queue to be picked up by the PPS service

Wait Process – the service has picked it up
Partial / Submitted – data is in the fact tables
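Those statuses behave like a tiny state machine. A toy rendering (the status names come from the notes above; the transition map is my own simplification):

# Submission statuses as a toy state machine; "Partial" may be reported
# while rows are still landing, before the final "Submitted".
SUBMISSION_FLOW = {
    "Pending": "Wait Process",    # queued, waiting on the PPS service
    "Wait Process": "Submitted",  # the service has picked it up
    "Submitted": None,            # data is in the fact tables
}

def next_status(current):
    return SUBMISSION_FLOW[current]

assert next_status("Pending") == "Wait Process"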

     


    Associations

Between models, you can associate dimensions – especially important between forecast and corporate models when different time scenarios are used, or when different accounts from the general ledger are part of the cube requirements; something harder than usual for an OLAP developer to get right (on the 1st test pass, at least).


    Appendix

Please email me for the actual files: lauragibbons@scorecardstreet.com.


     

       

    Remember…

1. Two keys with your dimensions: 1) check in your model before you proceed to any of the offered 'Available Actions', and 2) deploy and redeploy the model after changes are made to the structure whenever you want to refresh your cube (fact) data.

2. Two keys with your business rules: when making changes or adding new assignments and/or associations, PPS Planning requires you to 1) refresh and reprocess the model data and 2) redeploy the rules.

    Data Model Standard Abbreviations


    WORDS USED IN LOGICAL AND PHYSICAL MODELS

¥ – Mandatory Abbreviation

    WORD LOGICAL PHYSICAL

    Academic academic ¥ acad

    Access access access

    Account account ¥ acct

    Acronym acronym ¥ acr

    Account title line account title line ¥ accttl

    Actual actual ¥ act

    Address address ¥ addr

    Administrative administrative ¥ admin

    Advisor advisor ¥ advr

    Amount ¥ amt ¥ amt

    Agreement agreement ¥ agree

    Appointment ¥ appt ¥ appt

    Area area area

    Award award ¥ awd

    Balance balance ¥ bal

    Base base base

    Basis basis basis

    Budget budget ¥ bdgt

    Beginning beginning ¥ beg

    Billet billet billet

    Birth birth birth

C

    WORD LOGICAL PHYSICAL

    Campus campus ¥ camp

    Category category ¥ cat

    Code code ¥ cd

    Chair chair chair

    CHRIS chris ¥ chris

    Citizenship citizenship ¥ citz

    Close close close

    Class (academic class) class ¥ cls

    Commitment commitment ¥ cmit

    Committee committee ¥ comm

    Completed completed ¥ compl

    Continuing continuing ¥ contg

    Conversion conversion ¥ conv

    Cost cost cost

    Coterm coterm coterm

    Council council council

    Cross cross cross

    Course course ¥ crse

    Current current ¥ cur

D

    WORD LOGICAL PHYSICAL

    description ¥ desc ¥ desc

    Degree degree ¥ deg

    Department ¥ dept ¥ dept

    Direct (vs. Indirect) direct ¥ dir

    Distribution distribution ¥ distrib

    Division division ¥ div

    Dollar dollar dollar

    Date date ¥ dt

    Dual dual dual

    Effective effective ¥ eff

    Emeritus emeritus emeritus

    End end ¥ end

    Ended ended ¥ end

    Ending ending ¥ end

    Enrolled enrolled ¥ enr

    Enrollment enrollment ¥ enr

    Earned earned ¥ ern

    Ethnic ethnic ethnic

    Expense, Expenditure ¥ exp ¥ exp

    Extension extension ¥ ext

F

    WORD LOGICAL PHYSICAL

    Faculty faculty ¥ fac

    Federal federal ¥ fed

    Female female ¥ fem

    Field field field

    First first first

    Fiscal fiscal ¥ fisc

    Foregone foregone foregone

    Foreign foreign foreign

    Fiscal to Date fiscal to date ¥ ftd

    Full Time Equivalent full time equivalent ¥ fte

    Full full full

    Function function ¥ func

    Fund fund ¥ fund

    Funds funds ¥ fund

    Funding funding ¥ fund

    Fund Title 2 fund title 2 ¥ fund2

    Fund Title 3 fund title 3 ¥ fund3

    Fund Title 4 fund title 4 ¥ fund4

    Fund Title 5 fund title 5 ¥ fund5

    General general ¥ gen

    Gender gender gender

    General Ledger ¥ general ledger ¥ gl

    Graduate ¥ grad ¥ grad

    Grading grading grading

    Grand grand grand

    Group group ¥ grp

H

    WORD LOGICAL PHYSICAL

    Hire hire hire

    Hierarchy hierarchy ¥ hrchy

    Indirect Cost indirect cost ¥ idc

    Identifier ¥ id ¥ id

    Indicator ¥ ind ¥ ind

    Institute institute ¥ inst

    Instructor instructor ¥ instr

    Instruction instruction ¥ instn

    Intended intended ¥ intd

    Investigator investigator ¥ invstr

    Image image ¥ img

    Job Classification Code ¥ jcc ¥ jcc

    Leave leave leave

    Level level level

    Location location ¥ loc

    Major major major

    Male male male

    Month month ¥ mon

    Modified Total Direct Cost ¥ mtdc ¥ mtdc

    Name name ¥ nm

NIH (National Institutes of Health) ¥ nih ¥ nih

    No no no

    Non non non

    NSI (Network for Student Information) ¥ nsi ¥ nsi

    Null null null

    Number number ¥ num

    Operating Budget operating budget ¥ ob

    Off off off

    Open open open

    Organization ¥ org ¥ org

    Original original orig

    Other Sponsored Accounts ¥ osa ¥ osa

    Other other ¥ oth

P

    WORD LOGICAL PHYSICAL

    Parent parent ¥ par

    Percent percent ¥ pct

    Person person person

    Personal personal person

    Program program ¥ pgm

Principal Investigator ¥ pi ¥ pi

    Plan plan plan

    Prime prime prime

    Project project ¥ prj

    Proposal proposal ¥ prop

    Proposed proposed ¥ prop

    Project to Date project to date ¥ ptd

    Quarter ¥ qtr ¥ qtr

    Quarter Year ¥ qyy ¥ qyy

    Quantity quantity ¥ qty

    Rank rank rank

    Rate rate ¥ rt

    Registered, Registration ¥ reg ¥ reg

    Rejected rejected ¥ rej

    Residence residence ¥ res

    Residency residency ¥ res

    Restrict restrict restrict

    Return return return

    Subject Area subject area ¥ sa

    Salary salary ¥ sal

    School school school

    SUFIN ¥ sufin ¥ sfn

    Share share share

    Short short short

    Sound sound ¥ snd

    Social Security Number ¥ ssn ¥ ssn

    Special special ¥ spc

    Sponsored Project or Office ¥ spo ¥ spo

    Sponsor sponsor ¥ spon

    Source source ¥ src

    Start start start

    Status status ¥ stat

    Stipend stipend ¥ stip

    Stop stop stop

    Student student ¥ stu

    Stanford University ¥ su ¥ su

    Subschool subschool subschool

T

    WORD LOGICAL PHYSICAL

    Tenure tenure tenure

    Term term term

    Time time ¥ tm

    Title title ¥ ttl

    Total total ¥ tot

    Transaction ¥ tran ¥ tran

    Transfer transfer ¥ xfer

    Tuition tuition ¥ tuit

    Text text ¥ txt

    Type type ¥ typ

    Undergraduate ¥ ug ¥ ug

    Unit unit unit

    University ¥ univ ¥ univ

    Waiver waiver ¥ waiv

    Withdrawn withdrawn ¥ wdn

    Word word word

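If you want these standards enforced mechanically rather than by memory, a small lookup keyed on the logical word is enough. A minimal sketch using a few rows from the table above (the function and its fallback behavior are my own choices):

# A few physical abbreviations from the table above; extend as needed.
PHYSICAL_ABBREV = {
    "account": "acct",
    "amount": "amt",
    "balance": "bal",
    "code": "cd",
    "date": "dt",
    "description": "desc",
    "number": "num",
    "area": "area",      # some words are deliberately not abbreviated
}

def physical_column_name(*logical_words):
    # Unknown words fall through unabbreviated -- a policy choice of this
    # sketch, not a rule from the standard itself.
    return "_".join(PHYSICAL_ABBREV.get(w.lower(), w.lower()) for w in logical_words)

print(physical_column_name("Account", "Balance", "Date"))  # acct_bal_dt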

    Dis-parate or disparate – you decide…

I break the same word into two particular meanings when I emphasize a syllable. And in the near term, the most obviously misused definitions tend to fall on a word I take completely for granted: disparate. Now, I pronounce it as though it were spelled disperite – a 3-syllable, sing-song pronunciation. Dis-parate, on the other hand – or dis-parrot, as it should be spelled – refers to the sheep who tend to follow the loudest mouth in a data warehouse planning room. Separate thoughts become lost in favor of consensus-driven, sheep-like, group-think behavior. And because these same practitioners tend to think very highly of themselves, one is left wondering exactly why individuals cannot express what they are really thinking more often…Fear of others' judgements, fear of being wrong…? Who knows…

    Instantiating better views of data

In support of a better approach to merging Six Sigma with business intelligence practices, I have wondered why such a large subset of the folks I have asked believe the 2 to have distinct functions with no "transparent" overlap…Granted, I am somewhat non-objective, given my strong belief that a natural overlap exists and that the two will tend to merge as we extend beyond traditional verticals…And while there are some who would call on the use of BI in business-process-managed organizations, I am most impressed by the manufacturing arena, in all of its data glory. With its traditional, stereotypical data sets – like Cpk and Ppk, gap analyses and run charts, usually compiled using applications like MS Excel – manufacturing companies see the operational world through the integration of process and data, forming a perfect marriage of information-driven decisions and future innovation. Stratifying data into canonical dimensions for use in Pareto (80/20) charts seems like an integrated part of day-to-day operations, in line with financial operations or product engineering –
Enough said…
Here's where Six Sigma fits in…The transitive property…remember that timeless classic from our mathematics classes of yesteryear? When A = B and B = C, then A = C. In the same vein, if Six Sigma was spawned out of manufacturing plants which, for lack of a better BI solution, began to depend on MS Office databases and reporting tools – a form of operational BI – then Six Sigma and BI are virtually the same in their need for data and information.
     

    Process + People at the Right Place + Right Time = PM Success

    Launching PM systems is more than just technology…it is also process and people…

With regard to the 2nd note regarding recruiting strategies, I wouldn't limit myself to just 'IT' managers. This is a misnomer, as folks on the business/finance side of the house tend to understand PM / BI at its core, though they may call it something else, like financial heartbeat reporting, or by the product name, like Hyperion Essbase reporting.

At the end of the day, you want a true "data analyst/architect" type; a base knowledge of differing data architectures is extremely important, as is employing someone with the critical thinking skills to think outside the box. I know many an analyst who is far from what their title would imply; they might have labored in the past to create Excel spreadsheets which may or may not be impressive to an end audience, only to fail when you ask them to construct an analysis plan from scratch. True analysts will be able to derive areas of opportunity by merely walking the halls and talking to folks indirectly; they will "walk a process" before trying to solve the challenge with business intelligence.

    Now, on to recommendations for a performance management approach:

People do not know what they do not know, until they know it. In addition, once they know what they need to know, be prepared: they will want to change and iterate upon it.

The oohs and ahhs of dynamic gauges and widgets never last long if the system cannot respond to users' needs in a timely fashion. We all have war stories of deployments gone wrong when the company didn't adopt the system as we planned in our ROIs.

This is a direct result of practitioners developing solutions in a broad-sweep approach, wherein they try to boil the ocean upfront, promising all of the bells and whistles that vendors will sell you on, while not realizing that you have to approach PM and BI in terms of where you are trying to get to in your business, and how to use the 3 forms of BI to get there.

There is no one-size-fits-all approach. I would recommend Operational BI for your case, though not to the exclusion of the others; you need all 3 to be successful.

First, understand the three types of BI: strategic, tactical and operational. Each one comes with an implicit data-refresh frequency requirement…

Strategic BI is what many PM software packages focus on, as it offers large returns; however, it does not address the day-to-day decisions that need to be made outside of the C-levels.

Tactical BI helps managers and department heads make weekly and monthly tactical decisions, such as how to allocate resources to meet the new strategic initiatives.

Operational BI helps a much wider audience – from dispatchers and phone operators to hands-on managers to executives – make thousands of operational decisions each day; unlike the other two, these decisions need to be made quickly, before a problem escalates beyond control. This bucket is where I THINK you should focus.

While the relative value of each decision is smaller than in the other two, the collective impact of those operational decisions is significant.

    WWW, “World Wide Web” or “Widget Wiki World”

In a society more inclined to favor catchy phrases and sing-song slogans – a place, no, a world of growing reusability and modularized components (whether homes or web-based widgets) – one has to wonder whether we are still in the "world wide web" phase, or have we moved on, moved into a, wait, dare I say, Web 2.0 place?
Knowing that we have, but have not yet "arrived", it seems to me the opportune time to suggest a change for the long-standing www, the all-loving acronym that appears before most domain names in your browsing lives (www.thinkyouarecool.com, and so forth)…Instead of the world wide web, which is not necessarily true to life considering the restrictions placed on certain CNAMEs or IP address/DNS patterns, I propose we shift gears into the future and dub it the "Widget Wiki World" – widgets, for one, are used in everything from demonstrations to manufacturing plants, from internet-based apps to thick-client apps (do those still exist?); wikis, à la Wikipedia, are a growing phenomenon with far-reaching purpose and adoption.

    And, yes, this is the world in which we have arrived. A world outlined in reusable objects that can be generated and produced to suit your fancy, deposited and stored wherever you like, and replicated with ease and simplicity (think Staples’ Red Easy button). Dorothy, I don’t think we’re in Kansas anymore – not with 2.0 (web, that is) having knocked down our doors…

    Enterprise 2.0, An Interesting Spin Off from Web 2.0 Concept, for Business Intelligence

    Enterprise 2.0.

I thought I would follow up on an answer I added to the following LinkedIn question…

What is even more interesting is the linkage of Enterprise 2.0 and Business Intelligence – with the cost of traditional BI packages on the rise, coupled with mergers of the Big 5 dotting the landscape, more and more open source providers in the BI space are starting to emerge or gain traction.
    And it makes sense.
What is the biggest complaint folks have today about the BI applications in their organizations? The answer is usually "it's nice" or something similar they preface their real opinion with to soften it, when in fact they don't use the application, or haven't adopted it fully, because, simply stated, it hinders them from doing their day-to-day duties – even though it was positioned to save them days or weeks of human ETL processes with spreadmarts. Often, these overstated ROIs are never achieved, thus rendering the product a capital expense in year 1 and an operational expense (maintenance) for subsequent years.
With the open source arena, you don't have such herculean goals to achieve in order to prove value to an organization. You try before you buy, as there will always be some cost involved – whether at the infrastructure, db / data warehouse or application layer – in order to really support both SOA and the newer open source technologies. But the future is near, and those grabbing on without letting go are the future players to watch: they are the ones adapting agile techniques to the BI solutions space while maintaining a constant commitment to never building what the customer requirements haven't explicitly prioritized as important. Since BI has to have broad enterprise REACH (not just a solution) in order to build adoption and encourage use from your C-levels, you can think of open source BI as business intelligence for Enterprise 2.0.

    2008 TDWI Executive Summit

This past week was the TDWI Executive Summit for 2008, the event that kicks off the TDWI International Conference and Expo. Held in Las Vegas at Caesars Palace, the event was marked by pointed speeches and presentations, and deeply contextual focus groups discussing challenges and next-generation solutions within the business intelligence space.
     
I sat on 2 panels: one centered on gathering requirements using an agile approach, and one discussing key performance indicators. Some of the highlights are as follows:
     
    University of Illinois,
    University of Georgia
    (for both driving business intelligence and performance management into the heart of their organizations)
     
The Australia chapter of TDWI, and CB, aka POJ, for being direct and to the point rather than softening what could be a false statement meant to appease the other person – and who also took the embedded pictures while I was speaking on those panels,
     
    Research team at TDWI for making topics as relevant and as close to real life as possible.
     
    and of course, to the folks who did it back in 1997:
    Lawrence Livermore National Laboratory
     
     
    "A KPI is a series (5 or less) performance initiated, behavior changing metrics for measuring the heartbeat of one’s company…"
     
     
     
     
    ¬†A second point referencing the work I’d done creating the concept for, architecting, prototyping, and finally deploying¬†the Expedia balanced scorecard program and BI platform:
     
     
     
     

    Microsoft Announces the 2nd Annual Business Intelligence Conference – 2008 – Seattle, WA

Annual event to be held in Seattle at the Washington State Convention & Trade Center from October 6–8, 2008

     

Thank you again for joining us for the inaugural Microsoft Business Intelligence Conference 2007, and for your hand in making it so successful! You are receiving this email because you opted in to receive information about the 2008 Conference during the 2007 event registration. We invite you to join us again in 2008 for the 2nd annual Microsoft Business Intelligence Conference 2008, which will showcase Microsoft's market-leading BI products, solution expertise and customer successes.

     
Thanks, in part, to you, the first-ever Microsoft Business Intelligence Conference in May 2007 was an overwhelming success, drawing over 2,800 total attendees from more than 65 countries and resulting in great attendee and partner feedback! Plans are underway to ensure that the second annual event is even more successful, and we hope that you will save the date and plan to join us for the Conference in October.

     

This exciting and informative conference will showcase Microsoft's market-leading business intelligence (BI) products, solution expertise and customer successes. The event is designed to educate you, our customers and partners, on every aspect of Microsoft's BI offering. You can also expect even more educational tracks and sessions by Microsoft product and industry experts, customer best practices sessions, the second annual Microsoft BI Customer Awards presentation, hands-on labs, and much more! The Conference will be held at the Washington State Convention & Trade Center in the heart of downtown Seattle, Washington again in 2008.

     

    Gartner Releases 2007 Magic Quadrant for Data Warehouse & Database Management Systems

     
My previous engagement had a propensity for buying every possible enterprise-grade BI platform on the market, so I can attest to Microsoft, IBM and Teradata's entries. My exposure to Oracle is minimal, but what I have been exposed to (CDC) is really among the best out there. I am biased toward Microsoft's BI stack, though they recently slighted me regarding their Customer Conference in Sun Valley this year.
     
Personal feelings aside, what you save in cost with the MS licensing and server fees, you make up for in prerequisite software requirements, implementation fees and consulting services (mainly the latter), because you have to forcibly make it work the way you want it to – or the way you were sold it would work – and are disappointed when it doesn't turn out to be the be-all end-all product you thought it would be. ROI and BI is almost a joke, with the latter starting to feel way too ERP-like or CRM-like for my taste. MS offers a familiarity that the others do not if you are not accustomed to Crystal Reports or MySQL, because most of us in corporate America understand concepts like the "breadcrumb" trail…But that, too, is a misnomer – another urban legend sold by $160-200/hr consultancies while the product was still in beta. Buyer beware…Though participating in a beta can be fun and lucrative, you will most likely get to know and love the beta right before MS releases the product to market with a fraction of the features you really cared about. To make matters worse, there is a lost cost in the customization of the beta product when you have to invest a laborious amount of time uninstalling and reinstalling the components (in the correct order, no less), often involving registry keys. Do I need to say more? I certainly wouldn't want my grandmother calling regedit – would you?
Oh, and did I mention that those custom apps built by the fancy shmancy consultant often break after they leave? Most of us are used to the "maintenance contract", where you pay more money to have them return to research and fix what they built in the 1st place. I wouldn't mind paying for the fix, as organizations change metadata so often that I can see how a custom-built solution would need updates from time to time. But to come back and research, at a cost shouldered by the client as well? Don't I get fries with my combo meal?
     
Before you get excited about the lower price point, remember to ask the consultant (whom you will inevitably need) the following stack of questions:
     
-How much is your maintenance contract, if not included in the price?
-How many times, on average, did you have to return to a client to fix a solution you developed?
-What was the average duration (in days) of those fixes?
-Do you have any reduced-rate maintenance plans if we contract multiple years at once?
-What happens if you cannot deliver a feature set promised in the scoping (i.e., too difficult; can't; won't admit they can't; etc.), or we don't get the expected results and feature sets as promised by the vendor product material? Will you comp that, since we took a risk at your recommendation to work with a beta?
(Get that in writing.)
     
Don't get me wrong. I am a consultant too. However, the sweet spot for me is the many organizational hats I have worn: BI, scorecards, dashboards, Six Sigma, call center, R&D, statistical modeling, planning, analysis and budgeting are all areas I have worked in during my career. By understanding both vertical and horizontal streams, one can scope and analyze the true nature of the problem being solved. So many times, consultants – having more knowledge than you (let's face it, if they didn't, they wouldn't have been called in by your boss) or having the bandwidth that you don't (but mostly the former) – assume they can tell you a technical solution, throw together a loose rendering of the star schema they plan to implement, possibly a data flow diagram if you are lucky and, even more remotely, a security/auth logical flow model for your IT security ops team, so that you are wowed by the time they spent in Visio making the boxes "look pretty". Oh, and slow down the pen from signing if they rendered the drawings in 3-D. I have seen more 3-D process maps and data flow models drive the pen to the 'Sign here' line before the CIO could really take a look at the content.
We all have to make decisions at a rapid pace. And believe me, I love our MS platform (PerformancePoint Server 2007), delivered via SharePoint/MOSS 2007 to both internal and external customers/partners. But it came at a cost. A $1.5 million cost when all was said and done. Most of that (94%) was consulting services; and of what was delivered, all but 1 scorecard was broken within the 1st and 2nd years post-deployment.
Not to mention the additional cost we had to spend, on top of other additional costs from addendum statements of work, for the documentation on how to use this customized, non-functional thing.
     
Moral of the story: do your homework. The right consultants are out there, with experience deploying performance management solutions. Don't just go on customer testimonials; believe me, they really butter you up so that you will give a good reference. Ask the questions I listed – GET THEM IN WRITING – and do your homework (read blogs, search the Internet), and if you are going to go by testimonials, dig into their maintenance history and frequency of downtime. We were unable to sustain the 90% up-time goals we had set in year 1 prior to paying more money to have them return and upgrade the work to a better, more stable architecture, blaming the database dev who initially scoped it out.
     
If you get the right group of consultants, you will end up the winner with the MS toolset because of cost. If you have a limited budget, go with MS, but only buy a component and its prerequisites one at a time. Get it to work. Train your end users and measure adoption. Then add on to the stack by procuring other licenses.
     
If you have a decent-sized company, or even a medium to larger-sized org, go with Business Objects Edge (medium company size) or Enterprise (like the name sounds). And don't forget to add on BO Voyager…It allows you to use their presentation tools on top of existing OLAP structures (Analysis Services cubes), so that you do not throw away previous work. Presentation-wise, hands down, BO is the winner.
     
The Performance Management suite (Analytics) and the Dashboard tool (now part of the IDD product) are far more robust and rich with out-of-the-box data visualization schemas. And the BO BI stack has just plain been doing it for so much longer than MS, which really was one of the last big companies to delve into this long-standing technology sector. Props for what they have done. Just apply usage carefully. Build your roadmap and analyze whether you should stay with one stack or diversify. And keep a watchful eye on anyone building anything for you.
     
     
     
     
     
     

Jill [or Jack] of all trades, Master of none…In the Business Intelligence context, which is better? You decide…

So, all of the holiday festivities are done, and I start my new job in 5 days. After a much-needed sojourn and period of deep self-reflection, I am back, and with a fresh perspective. Me being me, my "ah-ha" of awareness centered deeply on the power of scorecards. Whether it is an activity you are doing to monitor the health of your business, or something you do for fun (albeit without recognizing that you are doing it), like golf, the act of scoring something – anything, really – is an activity as old as time. Scoring is the physical representation of putting pen to paper; or, more aptly stated, it represents the recording of live scores/metrics/key performance indicators and delivering said metrics in a way that makes sense to the scorer. Often, the scorer can clearly articulate the method to their "scoring madness", but upon displaying the scorecard to onlookers, will most notably be asked to translate the meaning of that scorer-perfected, personalized means of recording new entries. "Oh, that is my symbol for the bogey I landed at hole 8"; or "Red means it is generally under forecast for revenue, but for satisfaction, it means we were within a range that was under-realized…" Huh? And typically the scorer, noticing the look of confusion spread across your face, or responding to your request for further contextual information, will explain [at length] their scoring methodology, oftentimes losing the interest of the listener. Come on folks, even I, who was once quoted as "getting personally excited by [the program] Excel," can get bored; after all, for all the excitement I find in talking turkey with BI industry folks, it truly isn't the most exciting space in the world. It isn't like announcing "I'm a rocket scientist" or "I'm a CIA operative"; "I'm a business intelligence consultant" just isn't as shiny on the surface.
     
Oh, but it is…The typical devil's advocate, I see BI practitioners as some of the most cunning problem solvers, using several best practices and, of course, experience to build an analytical web of cause and effect and drill down into the architectural infrastructures at the core of any BI solution. The "5 Whys" of Lean Six Sigma stipulates asking "Why" 5 times about any problem you are trying to solve. Here is an illustration from a typical personal-goods retail store – The sales manager asks his analyst for a report showing the winter sales actuals, since he knows from his staff meeting that his team missed forecast by a substantial amount. Director 1 says to manager X, "I believe we didn't have the inventory on the shelves to support the winter marketing campaign," which he surmises from his pass through the store during the Christmas rush. "I overheard two customers complaining about our practice of offering promotions without stocking up accordingly." These practices create a sense of panic-induced scarcity, as customers start buying the products fueled by a sense of not being able to get them again anywhere else. To the customer, it merely creates feelings of annoyance and causes a buyer nightmare, as shoppers try to get one arm up on their fellow shopper [think of the 2007 Christmas gift nightmare of trying to acquire a Nintendo Wii].
     
As the manager, he asks you to put together some slides for the leadership team addressing the potential causes, and adds that he is interested in your opinions for mitigating this risk in the future. He adds, as he is leaving your office, that his strong preference is for you to start with his hypothesis – mind you, baked in nothing but a "gut feel" and an attempt to spy on customers – and let the data found there drive the other causal paths.
     
Ask yourself, as the manager: where do I begin? What do I glean from the problem statement, "We didn't hit forecast for the winter sale"? Secondarily, you were told, "we didn't have the inventory on the shelves to support the winter marketing campaign."
     
Well, on the surface, manager X knows [for a fact]…
     
• It was Q4 2007 (the winter campaign)
• Forecasted numbers are generated by your team
• The marketing team is also affected to some extent (the marketing campaign is part of the director's hypothesis as a cause, but for now you only know that during the period in question, there was also a campaign happening. Remember, 'correlation isn't causation'.)
• You definitely know you didn't ask enough of the right questions, since pondering the factual points leads you to wonder why the director believes it is related to inventory shortages; questions like "did he see some report outlining sales for the period…by category of retail item…did we hit targets for any product, which was buried in the shortages by other products…"

    Assumptions to ponder in your research:

     

• Was the marketing campaign related, and how is it related?
• What items weren't on the shelves, by a specific set of date-related stratification factors – during the after-Thanksgiving sales, the week before Christmas, the day after Christmas, New Year's Day, etc.?
• Was the extent of the shortages great enough that customers were complaining out loud, implying this happens all of the time [a point you are convinced is wrong, considering you are responsible for forecasting sales and reporting to leadership how close/far you came to hitting store targets, and would have noticed this problem in the data]?

As the manager, you immediately log into your instant messaging service and see your best analyst is online. You call her into your office, along with your assistant, for a debrief meeting. The analyst is told the situation and ponders the problem for a moment. As the manager rattles off the director's hypothesis, the analyst thinks about the "5 Whys" of Lean, zeroing in on the 2nd part of the problem statement: "we didn't have the inventory on the shelves to support the winter marketing campaign."

Why 1: Why does the director believe we didn't have the inventory as marketed in our winter marketing campaign? This spawns you to pull the list of items from the campaign, and the related sales for the time period, into a dataset. The results show that 2 of the 5 items had substantially lower sales than the other 3. In fact, the other 3 items hit target in terms of transactions sold and revenue generated. But why…

Why 2: Why did the other 2 items, which had revenue shortfalls in the double digits, look to have sold enough units not to raise flags against what was forecasted?

So, you pull up the ordering list and compare the items ordered to what was forecasted, and everything checks out. You do notice that you issued an unusually high number of rain checks for the 2 items, which you file in the back of your head to research later.

You query the database once again and generate an inventory report from your central warehouse; nothing is out of the ordinary – they supplied what was ordered. You also notice, however, that what was ordered was the same volume as what was ordered the previous month. This doesn't pass your sniff test…How is it possible that the month prior to a huge marketing campaign and the big sales month itself were forecasted to be the same?

You cross-check the list of items shipped to the store, and everything was accounted for by receivables. Upon further review, you leave this train of thought without noticing the slight difference between the numbers your query returned and what was in the report from your boss. The report came from the powers that be, so you didn't think to question where the numbers came from; nor do you now, but I wanted to tickle you with that thought for when this applies to your real-world situation – the effects of dirty data on the quantification of business intelligence gaps / problems. But I digress…Back to our shared train of thought…

So, if we ordered what we projected we would need, and it was supplied without any replenishment challenges, why 3 enters the scene…

Why 3: Why, if the supply chain process worked as expected, did we issue an unusually high number of rain checks for the items advertised in our marketing campaign, during what we consider to be our highest retail sales period of the year?

    While pondering this question, you remember and ask yourself Why 4:

Why 4: Why was the warehouse overstocked in the two items that showed revenue shortfalls more than any other? This prompts you to open a spreadsheet. You paste the results of your query into column A, newly labeled 'Forecast from System 1'; in column B, you enter 'Forecast from Leadership Report'; in column C, 'Time Period'; and in column D, 'Variance'. You then save the document as 'Multiple Forecast Systems Impact and Gap Analysis' and pull the order form again. You notice the authorization signature was the head of marketing, and not your boss as expected. After surveying some of the employees over in marketing, you map a secondary process for quarterly seasonal marketing campaigns, since, as you discovered, it involves a slightly different process. Since the process differs only in which department submits the order to the warehouse, the analyst adds this as column E, titled 'Authorized By'.
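If you would rather script that worksheet than build it by hand, the same gap analysis is a few lines of pandas. The column names mirror the spreadsheet described above; the figures are placeholders, not the story's actual data:

# Minimal sketch of the 'Multiple Forecast Systems Impact and Gap Analysis'
# worksheet; all numbers are placeholders.
import pandas as pd

df = pd.DataFrame({
    "Time Period": ["2007-10", "2007-11", "2007-12"],
    "Forecast from System 1": [1000, 1200, 2400],
    "Forecast from Leadership Report": [1000, 1150, 1800],
    "Authorized By": ["Sales Mgr", "Head of Marketing", "Head of Marketing"],
})
df["Variance"] = df["Forecast from System 1"] - df["Forecast from Leadership Report"]

# The tell: every nonzero variance lines up with a marketing-authorized order.
print(df.loc[df["Variance"] != 0, ["Time Period", "Variance", "Authorized By"]])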

The results of the data transformation paint an interesting picture. In every case of a forecast shortfall or overage, the authorizer was the head of marketing and the order was for a marketing campaign. Knowing this person to be very good about not making false claims about the health of the company, the manager wasn't overly swayed that this was the silver bullet, by any means. It became a bone on a subsequent fishbone diagram and Cause and Effect matrix, two effective tools when using the 5 Whys to solve a question. Unlike other periods, in which your team of analysts produces the forecast by running the root queries directly against the data warehouse, the marketing team uses a decision support system that the BI team built to deliver reports to the execs on demand.

Looking at the 2 source-of-data columns, it was evident that the marketing team always used the reporting platform, whereas we were running ad hoc queries directly against the DW. When our team was the approver of the RO, the data source was the DW. When the marketing team head was the authorizer, they used the reporting system. The last of the 5 Whys – Why did the numbers used by marketing vary so substantially from the numbers we pulled, in terms of the expected forecast for campaign periods, if the sources of data were ultimately generated by the same team?

And the answer started to reveal itself…The numbers in the table compiled above were different in every period involving campaigns that were ordered by the marketing team, and in those periods the reporting platform was used to generate forecasts of inventory to order. Our analyst, now exploring the 5 Whys, is the author of those reports, so the assumption was that they were accurate. The few times they were out of sync, pre-set alarms went off, alerting analysts before the impact worsened with the exponential nature of time. But the variances were there, and were high enough that I questioned why the thresholds were set as they were, thus escaping notice.

Believing the assumption that the forecasts were the same, and thus didn't need to be compared to one another, was a simple mistake to make – and one to learn to see before substantial financial loss occurs. Remember, 'supposed to be' ranks up there with should'a, would'a and could'a when it comes to multiple data reporting sources. In time, all systems can become out of sync, because one source gets updated without anyone fully analyzing or realizing the impact on related or linked sources. Or, two disparate replications of another system exist (like we BI professionals don't know all about the "under the desk" solution, or servers living outside of IT and under our desks…pish posh) but are not linked; when one gets updated, the other produces inaccurate and outdated information, even if the same team of IT or business intelligence folks is responsible for maintaining both sources of information.

It is a concept known as systems synchronization. Companies mitigate this problem in a number of ways, and when syncing issues do occur, they analyze and correct the issue, often escaping the notice of the wider population in their attempts to align the data between sources. But there is always a period between synchronizations when the systems are out of sync, and when groups of employees are exposed to those same systems and pieces of data – data they use to produce their manual ETL magic, transforming it into readable information and, ultimately, fancy business PowerPoints for leadership, often telling a tale in which 2 or more systems are intertwined in order to answer a question. Any question posed by anyone in the organization can be compiled into data sets in a sorted Information Symphony, with the asker knowing nothing of the fact that, 50% of the time, someone is going to roll the dice and land in the unaware zone: a land of growing data integrity issues that has formed in one of those systems. And, even worse, who is to say which source is the system of record, or is even correct, once the damage is done? Users will always question the validity of the data from that system afterwards – not out of malice, but out of disbelief in the intelligence being propagated for their benefit. The effects, once the damage is done, are significant, not minimal.
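A crude but effective mitigation is a scheduled reconciliation between the two sources, so the drift is caught before the PowerPoints go out. A hypothetical sketch (the source names and tolerance are invented):

# Hypothetical drift check between a data warehouse and a reporting platform.
def check_sync(dw_totals, reporting_totals, tolerance_pct=0.5):
    # Return the periods where the two systems disagree beyond tolerance_pct.
    drifted = []
    for period, dw_value in dw_totals.items():
        rpt_value = reporting_totals.get(period)
        if rpt_value is None:
            drifted.append(f"{period}: missing from reporting platform")
        elif abs(dw_value - rpt_value) / dw_value * 100 > tolerance_pct:
            drifted.append(f"{period}: {abs(dw_value - rpt_value):.0f} apart")
    return drifted

print(check_sync({"2007-12": 2400.0}, {"2007-12": 1800.0}))  # ['2007-12: 600 apart']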

In the end, it was the slightly different process causing the issue. Once it was changed so that a centralized orderer was responsible for submitting to the warehouse, the out-of-sync problem was also fixed on a more permanent basis: by creating a centralized ordering system.

However, without the skill of the 'Jill of all trades' – as I was once dubbed by a mentor, along with 'Crackerjack Analyst' – this company might have lost additional campaign-related revenue from other botched campaigns. An analyst at heart is hard to find among employees; that quality is especially true of, and necessary within, the business intelligence professionals of the world, though often neglected during standard hiring practices.

I learned what I learned while I was an analyst, though I appreciated it the least, as it wasn't a respected technical position within my org. In fact, I believe it was dubbed the lowest notch on the proverbial totem pole within the IT department. As a final farewell to 6 years of my life and amazing growth in this field, I bid the greatest of thanks to Expedia for everything I walk away with, especially the respect of my colleagues.

Hopefully, readers of the world, this example will help you to see, and to share (in the form of comments), the real-life BI problems and solutions dreamt up by or affecting you.

To those 'Crackerjack analysts' out there…we are fewer and fewer by the day and have to watch out for each other. Our tenet is simple: to provide objective and meaningful data to enable decision making for the organizations in which we are employed. Happy New Year to all, and best wishes for making 2008 the year in which you shine.

    Part I – Business Intelligence vs. “the Product”

How often do we hear individuals reference BI in terms of the product…"Business Objects supplies our BI," or "we are a MS shop." Or, like many, you might have a combination of products in house, each used for 1 of its many offered features in combination with the other applications (themselves also used for just 1 or 2 features), when any one of them could actually support all of one's needs – if one took the time to learn before buying/building something shiny and new.
     
I also hear things like "BI = BO" [Business Objects] rather than "BI" [business intelligence]…
     
What do ubiquitous words like 'business intelligence' mean, anyway? Is it the product, the process, or the end user's usage of said product that matters?
     
    It is interesting to find out the responses when one reaches for answers. Ignorance is bliss, as they say, but knowledge is power.
     
     

Open Source for BI with Scorecards and Dashboards – "…EIBIO"

[Certain liberties have been taken to illustrate a point; this does not represent any of my corporate affiliations or employers. Statements are intended to generalize particular workplace concepts in an effort to reach a broad network of professional and interested readers, and should always remain classified as a 'focus group of one's' opinion – my own.]


Business Intelligence is a subject I think about so often that I fall prey to believing that all business-minded folk are partaking of the same general thought. And while there are some out there who are (quick quiz to know if you are one – do you/have you ever done/said ANY of the following:

    a) I love the new Excel 2007/Office 2007,

b) I'm a data geek [women – you know who you are, and how we use this line when we have a crush on someone in the technical realm of the workplace, especially as you refer to your quirky interest in a particular technology, software product or game]

Aah, but I digress…

or c) have ever been excited when selected to be a part of what feels like "the exclusive BETA club" – a scenario that gives you, the end user, early exposure to a previously unreleased piece of software, oftentimes in scenarios where a technology company actually wants and solicits your feedback about what it could change, add or remove before the product "RTMs" (releases to manufacturing).

If you answered 'Yes' to any of the above, you are not the folks I am referring to – you have a love so deep for business intelligence, or at least, if you do not love it, a deep knowledge begrudgingly earned over the years, earning you a prosperous if unrewarding career. No, not you. It is the ones who haven't a clue what BI is, not knowing that most people in the workplace are using, or have used or interfaced with, spreadsheet software – and that act, in essence, is BI. It is more than a fancy set of terms that, when strung together, form a very intelligent phrase – 'Business Intelligence' sounds smart…Right?

    Well, not to those who answered ‘Yes’ to the above quiz, like myself.

But if BI is more than just your fancy – i.e., you work in the field – you will know that getting the generation dubbed the baby boom, from the data warehousing world, to see a new concept in a space they 'own' is almost impossible. Likewise, the propensity for the generations dubbed X, Y or Z to slow down and understand the principles of warehousing is pretty low. No matter: there seems to be an equal likelihood for folks to feel what I call the "this is mine" syndrome –

Often, when automation is introduced – admittedly a feeling I experienced once in my career – the fear of 'you're taking away the core function of the work I perform, so what will I do to be valuable / keep my job' is at its highest. Employees will wrap their hands around whatever it is, consciously or not, and try both to appear an expert and to seem capable of producing a huge lift in execution, i.e., "jump-starting" themselves. They (I) can feel that they lack ability in the eyes of their leadership, and may or may not become challenged to do more and better, or may become resigned and less productive over time. But for me, within a matter of days of the project launching, I felt incredibly connected to the new automation project's requirements gathering – thus driving how the solution that would save me time in my day would function, serve needs and deliver value.

Thinking more broadly about this experience brings me back to Open Source BI and, specifically, to how to build a simple reporting architecture into a robust suite of data visualization applications and methods. Think of an Excel workbook, which includes 3 worksheets by default. Each worksheet can hold tables, images, graphs, charts and other data visualization elements. Consider an example analysis that has a chart, a table supporting the chart, supplemental text, and drop-down filter boxes for report author consistency – this has 4 data visualization elements to it and was very specific to a certain analysis I had in mind (a Time and Motion study) –

Now, consider if I were 1 of 10 analysts in a company. If all 10 of us each built a report using the same number of elements as above, that is 10 analysts x 4 elements = 40 possible data visualization elements (multiplied further by the combinations of any of those elements). Think of how much of that could be inconsistent, depending on the report developer/analyst subjectivity picked up by working in one operating unit over another.

Now, think about the challenge of creating a consistent BI platform that allows end users to create a customized report based on varying needs for data.

It's not easy – companies that grow rapidly often experience the pain of having 'multiple versions of the truth' floating around, due to the many spreadsheets introduced for the same number/figure/KPI. Disparate databases make for an even greater challenge as one strives to report on information intelligence.

What if we could offer you a blank slate – tabula rasa, if you will – where each element, each thing, product or tool that you have access to in your company, became a widget: a little tool you could enable and use as part of your larger blank-slate screen. You add a graph to the top left and a chart to the top right, connecting both elements to a data source like a Business Objects Universe, or a MOLAP cube, or a relational db, or a flat file like another Excel workbook. Who cares…depending on your company, you may have many options or a few. In essence, you are connecting your one 'tabula rasa' to many potential data sources. Then you add a supplemental table, a hyperlink to a printable view (of the databoard you are building on the fly) and an 'Export to PowerPoint, Word, Excel, Crystal, SSRS (Reporting Services), WebI – whatever, you get the point', all in one interface, like a simple ASP webpage, called a 'dashboard'.

Now imagine each widget was built from code (well, this is real, not imagined), and that the code, while protected in a repository like SourceSafe or Perforce from permanent deletion and secured in other ways, was offered to you, your developer or your consultant for ease of access. Then imagine we offered you a 'sandbox' server for you to publish your BI elements to, thus growing the overall pool of BI web services and content – much like a wiki such as Wikipedia does to collect its information.
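To make the widget idea concrete, here is a toy sketch of a dashboard composed of widgets bound to arbitrary data sources. Every name in it is hypothetical, invented purely to illustrate the shape of the idea:

# Toy model of the widget/'tabula rasa' dashboard; all names are invented.
from dataclasses import dataclass, field

@dataclass
class Widget:
    kind: str        # "graph", "table", "export link", ...
    position: str    # e.g. "top-left"
    source: object   # any callable data source: cube, db, flat file...

@dataclass
class Dashboard:
    widgets: list = field(default_factory=list)

    def add(self, widget):
        self.widgets.append(widget)

    def render(self):
        for w in self.widgets:
            print(f"{w.position}: {w.kind} showing {len(w.source())} row(s)")

board = Dashboard()
board.add(Widget("graph", "top-left", lambda: [("Q4", 1800)]))   # stand-in for a cube
board.add(Widget("table", "top-right", lambda: [("Q4", 2400)]))  # stand-in for a db
board.render()

One connector per widget, one canvas per user – that is the whole trick; the repository and sandbox pieces layer on top.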

No matter the technology, the services exposed could be extracted for use in this application, connectors and all. While I know it will be a while before we can make this happen, we could start thinking about it now – while offering a more iterative approach to data mart development (i.e., how useful is it for a BI team to function without a decent data warehouse, or at least a quick-response development cycle?), following the standards as necessary but delivering in project execution. Add on top of that a solid engagement model, and a robust and customizable series of dashboards.

This is merely scratching the surface of the next-gen BI that gets me really excited.

If Pervasive BI is to Companies as CRM was in the 90's, then…

Considered a potential BI Pioneer by Microsoft, and someone who cares deeply about the CPM trend, I can say that there are folks in the world who assert that SOX compliance and BI/CPM should be naturally wedded in order to create harmonious accord within the workplace. Yet even good points made through a social media site like LinkedIn should be taken with a grain of salt regarding how SOX is perceived at the CxO level, albeit for the most part I agree.

    To add on to this question posed in LinkedIn’s new Answers section: part of this is due to how the ‘SOX teams’ used the term so negatively, driving so much fear into developers, business and finance teams and more, that people became desensitized to the genuinely important protection that SOX implicitly offers. As they say, ‘he/she who cried wolf…’

    Secondly, folks used ‘SOX-related’ to drive projects through the pipeline at a higher priority than those previously flagged as business-valuable / revenue-generating / cost-saving. And later, as the SOX dust settled and established procedures and policies went into place, some of those projects were sunsetted (‘put on the shelf’) when companies with decent compliance in place didn’t see many of the promised returns from passing SOX the ‘first time.’ But I digress –

    If pervasive BI is to a company what CRM was in the 90’s, and CPM delivers what TQM promised when applied to your human resources, then SOX feels like it will naturally evolve into areas like BI/compliance reporting, more thoroughly and implicitly associated with true performance management. In fact, balanced scorecards have even adopted more of a compliance look and feel within the Internal perspective. Or at least, that is me reading tea leaves…

     

    Will SOA for BI Enable More Points of Intersection Between Business Intelligence & Six Sigma?

    For those of you who are either unaware that I blog on iSixSigma’s blogosphere, or too unimpressed with the concept of multiple Internet browser sessions (yes, there are some still not leveraging our Internet friend, Firefox), here is my most recent blog:
     
    "

    To go to Blogosphere, click here or read on :

    As my title would suggest, the concepts of business intelligence and Six Sigma seem to be revolve around each other according to most practioners, rather than intersect, like I believe. Gone are the days (or should be) where the applications and business processes are distinct, individual units operating outside the realm of process improvement projects. Wait, back up — If SOA, or Service Oriented Architecture, is by textbook defined as the ‘archtectural retooling of software that allows for the exploitation of open standards that have been adopted by software companies (page 6, ICMI Call Center Magazine, June 2007).’ Woah – What?

    In other words, SOA allows various silo’ed business processes to work together through middleware that presents the end user with a standard user interface, while giving that same end user the power to custom-configure these applications based on their own unique needs. Think of a Chinese restaurant menu’s family dinner, where you can mix and match from a variety of delicacies in order to create the most perfect family meal for you and your party. Likewise, Service Oriented Architecture does the same, except you are creating the most perfect picture of your business through the use of intelligence, reporting and scorecarding/dashboarding (also known as Enterprise Performance Management).
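    As a rough sketch of that middleware idea in Python – the service names and the one-method contract are my own invention, purely to illustrate the ‘family dinner’ point:

        from abc import ABC, abstractmethod

        class BIService(ABC):
            """The standard interface every silo'ed application exposes."""
            @abstractmethod
            def query(self, metric: str) -> float: ...

        class FinanceService(BIService):
            def query(self, metric: str) -> float:
                return {"revenue": 1_250_000.0}[metric]

        class CallCenterService(BIService):
            def query(self, metric: str) -> float:
                return {"avg_handle_time": 312.0}[metric]

        # Mix and match from the menu: the end user composes their own meal.
        snapshot = {
            "revenue": FinanceService().query("revenue"),
            "avg_handle_time": CallCenterService().query("avg_handle_time"),
        }

    The point is not the toy classes; it is that once every application speaks the same small contract, the end user, not IT, decides what lands on the plate.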

    What is the key to the intersection, though, and why am I writing about this in a Six Sigma online community forum? Think of all of the disparate business processes that we work on every day, and think of all of the projects you have worked on in which you a) had to fight to prove your benefits saved with finance (especially if your company allows soft benefits), b) needed to produce reports as a means of delivering numbers to the process owner and champion on the success of your outputs, as defined by your control plan over time, c) needed to validate whether your project even warranted a Black or Green Belt’s time by leveraging business reports to do so, or d) like myself, just have a keen love for the colors red, yellow and green when used to measure the health of the business.

    Ok, I suspect there are fewer of us in the d) category, but I may be wrong – my hypothesis is that most of you out there fall into a), b) or c), or a combination of these, depending on your business. I am also assuming that you have some type of underlying database structure in place before you launch Six Sigma in general; otherwise, I might pull my project in lieu of working on setting up an enterprise data warehouse, but that is a whole other can of worms for another blog, for another day. I diverge… and now we are back…

    If any of the options above describe you, then you are in need of, or wanting, business intelligence. And since my example was scoped to helping you establish your project or report on the success of a project, I suspect you see the analogy I am trying to make when I say there is an intersection between business intelligence and Six Sigma. In any effort to create consistency or less variation, which is a tenet of the DMAIC methodology, one should gravitate toward the learnings of SOA for BI – whether as a poka-yoke or SPC/SPM vehicle, a savings validation, or a time/project management tracker; the subject matter doesn’t matter – what does is an implicit understanding of the principles at large. It is no longer limited to just IT; it is all of our responsibility…"
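    To put a little code behind option b) and the SPC angle above, here is a small illustrative example – naive 3-sigma control limits over a metric you might pull from a control-plan report. The data is made up, and a proper individuals chart would use moving ranges; this is just the flavor:

        weekly_defect_rate = [2.1, 1.9, 2.4, 2.0, 2.2, 3.9, 2.1, 2.3]

        mean = sum(weekly_defect_rate) / len(weekly_defect_rate)
        sigma = (sum((x - mean) ** 2 for x in weekly_defect_rate) / len(weekly_defect_rate)) ** 0.5
        ucl, lcl = mean + 3 * sigma, max(mean - 3 * sigma, 0.0)

        # Flag out-of-control weeks for the process owner and champion.
        flags = [(week, x) for week, x in enumerate(weekly_defect_rate) if not lcl <= x <= ucl]
        print(f"mean={mean:.2f} UCL={ucl:.2f} LCL={lcl:.2f} flags={flags}")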

    Stop the Silos –> Open Source for Corporate Business Intelligence – Part 1

     

    Little has been discussed recently about the growing corporate silos of business users who are taking back control of their own IT legacy systems and budgeting, within their own budgets, for future enhancements or, even worse, entirely new applications. This is particularly true within the BI space. Having moved to "the dark side" only a month ago, but being well versed in the BI space, I found this extremely true for my own personal migration. Having grown tired of waiting for project prioritization to bubble up my request for scorecards, but before going truly rogue and renegade in my approach, I went on an approval campaign about three years ago now. Huh?

     

    An Approval Campaign – where a business user goes from IT group to IT group, asking whether they would a) be interested in building a scorecard app, or any app for that matter, b) could do it within my one-year time frame, c) could REALLY do it within my one-year time frame, and d) would be willing to support it within an IT-regulated production environment. D) was the only taker, and really is the only one that is a win/win for both parties.

     

    What the approval campaign gives you is a CYA mechanism: the ability to point out, years later, when the label "under the desk shadow app" gets placed on your pride-and-joy platform, that you DID, in fact, seek out the ‘proper channels’ and were turned away by the not-so-subtle use of lower-priority terminology, basically eliciting a response similar to a canine’s hackles going up when someone climbs into its territory. While it may seem innocuous, it is anything but. It is a way for IT to scapegoat, to blame, and to cast aside any wrongdoing in service disruptions; to enable them to point fingers and deny any culpability for the eventual service issues that will arise (and believe me, they will). This is exponentially compounded in those orgs that have undergone a greater-than-average mass exodus of employees, aka your corporate attrition, without a solid means of knowledge sharing or succession planning on the part of HR, the employees, or the managers enforcing those departures.

     

    Seems like a conundrum beyond any means of course correction – or does it?

     

    Let’s start to draw some parallels with the concepts of ‘Open Source’ and ‘SOA for BI’ in terms of a fable that I am going to share with you:

     

    ~ Once upon a time, in a land far, far away, there were many software companies and consortiums who would hire very intelligent engineers to help them build their ‘visions’ of the future. Their constituents, "the business", kept changing priorities and drove such a ‘flavor of the month’ sense into development that no one knew what to concentrate on, and thus all did it ‘a li’l bit different’ – standards, what standards? That was our core value, along with the notion that documentation is for the weak. Some people got so frustrated with the hard-coded system integration and recoding of work that they left the kingdom, and so left the knowledge of the architecture supporting the kingdom. Even worse, the evil dragon named Nonstandardized continued to burn any attempts to create pocketed standardization guidelines, because there wasn’t any support at the kingdom level beyond the fiefdoms’ attempts.

     

    But all was not hopeless in the land of spaghetti-coded platforms – oh no… Why? Because as the long-timers left, in came newbies with their unabashed conviction that they could architect a retooling of software just like our friends next door in the land of best practices were doing: leveraging open standards, modularizing applications as services with standardized interfaces, and most importantly, finding a means to address the evil King, ‘Disparate Business Process’.

     

    And lo and behold, one day a prince named Middleware came in, riding on his noble widget, breaking down anyone claiming "but we’ve always done it that way" in his path. His ride took him from software land to software land, breaking down silos and smoothing business processes that had been isolated and forced into silo-ness by the technology barriers between the loyal tenants of the kingdom’s applications.

     

    And he dropped his widget onto a vast and empty space whose look and feel had been standardized to resemble all the other spaces’ schema. But instead of mirroring the space next door, which placed its hub widget in the middle of the space with all the offered supporting informational spokes around the perimeter, he placed his widget, and nothing else, at the top of the space, and called his new land the land of ‘Scorecard’ – he enabled the same level of information for his people by linking and drilling deeper into this primary scorecard widget, and by doing so created a space worthy of the King’s time (short, sweet, and to the point), but also gave a path to deep-dive into the meaning of his widget for those users more accustomed to seeing more, not less, information.

     

    And people started to follow. From one space to the next, the kingdom underwent a radical face lift, and together the silos, without realizing they were conforming, started to conform; but they didn’t live happily ever after….

     

    Why?

     

    Because the root cause of why the silo’ed initiatives happen in the first place was never addressed – and thus enters the open source concept that I am so passionate about. Little has truly been discussed about this space… Part 2 will talk about how to leverage an open source platform with an SOA-for-BI infrastructure, and by doing so not only stop the business from pursuing their own silo’ed initiatives, but also give your developers the ammunition to build and customize apps while adhering to your set BI standards and guidelines.

     

    PS – Did I say I love my new job…?

     

    Business Intelligence Services

    Corporate Performance Management (CPM) has long been an area of either triumphant defeat or winning success – but the gray area in between? Not so much… So I started thinking about what makes one company successful while driving non-success in others…

    When most companies start some type of performance management program, they do not even know that is what they are doing/launching. For example, if any of you are reporting on company health metrics (i.e., revenue, etc.) or cost drivers (i.e., COGS) and tying them back to the plan using a tool like MS Excel, then you, too, are buying into CPM. Or are you…?

    That was a bit unfair of me, because it was meant to be a trick question. While you are reporting on health metrics and using a tool from the BI family, you are most certainly not operating a CPM system.

    Why?

    Even though you are comparing company performance to targets or plan, is it tied to the strategy? Do the company goals cascade down to the employee level – and I mean all the way down to the call center agent (if you operate one internally)? Do your employees conduct their day-to-day activities prioritized off those same goals that the CEO uses?
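    Here is a toy sketch of the difference – the roles, metrics and targets below are invented, but the shape is the point: every metric hangs off the strategy and cascades down:

        strategy = {
            "goal": "Grow profitable revenue 10% YoY",
            "cascade": {
                "CEO":               {"metric": "revenue_growth", "target": 0.10},
                "VP Sales":          {"metric": "new_bookings",   "target": 0.12},
                "Call Center Agent": {"metric": "upsell_rate",    "target": 0.05},
            },
        }

        actuals = {"revenue_growth": 0.08, "new_bookings": 0.13, "upsell_rate": 0.04}

        # Excel tells you the numbers; CPM tells you whether each level of the
        # org is on track against the same cascaded goal.
        for role, goal in strategy["cascade"].items():
            status = "on track" if actuals[goal["metric"]] >= goal["target"] else "behind"
            print(f"{role:18s} {goal['metric']:15s} {status}")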

    Next, I will walk you through how to take your MS Excel view to a CPM view with minimal cost / maximum potential for returns.

    …I’m back…