How Do You Use LinkedIn? (Social Media Infographics)

How often do you refresh your LinkedIn profile pic? Or worse, the content within your profile? Unless you are a sales exec trolling the social networking site, or a job seeker, I would surmise not that often; in fact, "rarely" is the most apropos description. Thoughts…? (Yes, she's back (again), but this time for good, dear readers…@Laura_E_Edell (#infographics) says thanks to designinfographics.com for her latest content postings!)

And just because I call it out doesn't mean you will know the best approach to updating your LinkedIn profile. And guess what…there's an infographic for that! (http://www.linkedin.com/in/lauraerinedell)

Check out my profile on LinkedIn by clicking infographic


Futures According to Laura… Convergence of Cloud and Neural Networking with Mobility and Big Data

It has been longer and longer between my posts, and as always, life can be inferred as the reason for my delay.

But I was also struggling with feeling a sense of “what now” as it relates to Business Intelligence.

Many years ago, when I first started blogging, I would write about where I thought BI needed to move in order to remain relevant. Many of those futures have since come to fruition, ranging from merging social networking datasets into traditional BI frameworks to the now-common use case of applying composite visualizations to data (microcharts, for example).

Perhaps more esoteric was my staunch stance on the Mobile BI marriage, which, when the first iPhone was released, was a future many disputed with me. In fact, most did not own the first release of the iPhone; many were still RIM subscribers, and it was hard for the BlackBerry crowd to fathom a world unbounded by keyboards and scroll wheels, or how that would be a game changer for mobile BI. And of course, once the iPad was introduced, it was a game-over moment. Execs everywhere wanted their iPads to have the latest and greatest dashboards/KPIs/apps. From Angry Birds to their daily sales trend, CEOs and the like had new brain candy to distract them during those drawn-out meetings. Instead of the PDF or PowerPoint update, they wanted to receive the same data on their iPads.

Once they did, they realized that understanding "WHAT" is happening was only the crack to get them hooked for a while. Unfortunately, the efficacy of KPI colors and related numbers only satisfies a one-person show, and as we know, it isn't the CEO who analyzes why a RED KPI indicator shows up. Thus, more levels of information (beyond the "WHAT" and "HOW OFTEN") were needed to answer the "WHY" and "HOW TO FIX" of the underlying root-cause issue.

The mobile app was born.

It is the reborn mobile dashboard that has been transformed into a new mobile workflow, more akin to the mobile app. 

But it took time for people to understand the marriage between BI dashboards and the mobile wave: the game change Apple introduced with its swipe and pinch-to-zoom gestures, the revolution of the App Store for the "need access to it now" generation of execs, the capability to write back from mobile devices to any number of source systems, and how, functionally, each of these seemingly unrelated pieces would and could be woven together to create the next generation of mobile apps for Business Intelligence.

But that’s not what I wanted to write about today. It was a dream of the past that has come to fruition. 

Coming into 2013, cloud went from being something that very few understood to another game changer in terms of how CIOs are thinking about application support of the future. And that future is now.

But there are still limitations we are bound by. We either have a mobile device or we don't, and it is on 3G, 4G or Wi-Fi. Add to that our laptops (yes, something I believe will not dominate the business world someday), and compound that with other devices like smartphones, eReaders, desktop computers, et al.

So, I started thinking about some of the latest research regarding neural networks (another topic I have posted about, on the future of communication via neural networks), published recently by Cornell University here (link points to http://arxiv.org/abs/1301.3605).

And my natural "Plinko" thought process (before you ask, search for the Price Is Right game and you will understand "Plinko thoughts") bounced from neural networks to cloud networks, and from cloud networks to the idea of a Personal Cloud.

A cloud of such personal nature that all of our unique devices are forever connected in our own personal sphere whenever they are on our person. We walk around, each with our own personal cloud. Instead of one mass World Wide Web, we each have our own personal wide area network and our own personal web.

When we interact with other people, they can choose to share their personal networks with us via neural networking or some other sentient process. Or, in the example where we bump into a friend and want to share details, all of our devices have the capability to interlink via our Personal Clouds.

Devices are always connected to your Personal Cloud, which is authenticated to your person, so that passwords, which are already reaching their shelf life (see: article for more information on this point), are no longer the annoying constraint when we try to seamlessly use our mobile devices on the go. Instead, devices are authenticated to our Personal Cloud, following principles similar to where IAM (Identity and Access Management) is moving in the future. And changes in IAM are not only necessary for this idea to come to fruition, they are on the horizon.

In fact, Gartner published an article in July 2012 called "Hype Cycle for Identity and Access Management Technologies, 2012," in which Gartner recognized that the growing adoption of mobile devices, cloud computing, social media and big data was converging to drive significant changes in the identity and access management market.

For background purposes, IAM processes and technologies work across multiple systems to manage:

■ Multiple digital identities representing individual users, each comprising an identifier (name or key) and a set of data that represent attributes, preferences and traits

■ The relationship of those digital identities to each user’s civil identity

■ How digital user identities communicate or otherwise interact with those systems to handle information or gain knowledge about the information contained in the systems

If you extrapolate that third bullet out, and weave in what you may or may not know about neural networking or brain-to-brain communication (see the recent Duke findings by Dr. Miguel Nicolelis here; the link points to http://www.nicolelislab.net/), one can start to fathom the world of our future. Add in cloud networking, big data, social data and mobility, and perhaps the Personal Cloud concept I extol is not as far-fetched as you initially thought when you started reading this post. Think about it.

My dream, as with my other posts, is to be able to refer back to this entry years from now with a sense of pride and an "I told you so."

Come on – any blogger whose predictions come true years later deserves some bragging rights.

Or at least, I think so…

MicroStrategy World 2012 – Miami

Our internal SKO (sales kickoff) meeting was the beginning of this year's MSTR World conference (held in Miami, FL at the InterContinental Hotel on Chopin Plaza). As with every year, the kickoff meeting is the preliminary gathering of the salesforce, an effort to "rah-rah" the troops who work the front lines around the world (myself included).

What I find most intriguing is that MicroStrategy is materializing all of those BI pipe dreams we ALL have. You know the ones I mean: I didn't buy socialintelligence.co for my health several years ago. It was because I saw the vision of a future where business intelligence and social networking were married. Or take cloud intelligence, aka BI in the cloud. Looking back to 2008, I remember my soapbox discussion of BI mashups, à la My Google, supported in a drag-and-drop off-premises environment. And everyone hollered that I was too visionary, or too far ahead. That everyone wanted reporting, and if I was lucky, maybe even dashboards.

But the acceleration continued, whether adoption grew or not.

Then I pushed the envelope again: I wanted to take my previous thought of the mashup and morph it into an app integrated with BI tools. Write-back to transactional systems or web services was key.

What is a dashboard without the ability to take action within the same interface? Everyone talks about actionable metrics/KPIs. Well, I will tell you that to have a KPI, by definition of what a KPI is, means it is actionable.

But making your end users go to a separate ERP or CRM to make the changes necessary to affect a KPI will drive your users away. What benefit can you offer them in that instance? Going to a dashboard or an Excel sheet is no different: it is one application to view and, if they are lucky, to analyze their data. If they were using Excel before, they will still be using Excel, especially if your dashboard isn't useful to day-to-day operations.

Why? They still have to go to a 2nd application to take action.

Instead, integrate them into one.

Your dashboard will become meaningful and useful to the larger audience of users.
Pipe dream right?

NO. I have proved this out many times now and it works.

Back in 2007-2008, it was merely a theory I pontificated with you, my dear readers.

Since then, I have proved it out several times over and proven the success that can be achieved by taking that next step with your BI platforms.

Folks, if you haven't done it, do it. Don't waste any more time. It took me less than three days to write the web services code to consume the Salesforce APIs, including Chatter (business "Twitter," according to SFDC), into my BI dashboard (a mobile dashboard, in fact).
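For the curious, the shape of that web-services code is roughly the sketch below. The instance URL, API version and token are hypothetical placeholders, and the field names follow Salesforce's Chatter REST feed format, so check your own org's documentation before relying on them:

```python
import json
from urllib.request import Request, urlopen

# Hypothetical org instance and API version -- substitute your own.
INSTANCE = "https://na1.salesforce.com"
API_VERSION = "v52.0"

def chatter_feed_url(instance=INSTANCE, version=API_VERSION):
    """Build the Chatter REST URL for the logged-in user's news feed."""
    return f"{instance}/services/data/{version}/chatter/feeds/news/me/feed-elements"

def parse_feed(payload):
    """Flatten a Chatter feed JSON payload into (actor, text) rows a dashboard grid can consume."""
    doc = json.loads(payload)
    return [(el["actor"]["displayName"], el["body"]["text"])
            for el in doc.get("elements", [])]

# Against a live org (not executed here), the call would look like:
# req = Request(chatter_feed_url(), headers={"Authorization": "Bearer <access_token>"})
# rows = parse_feed(urlopen(req).read())
```

From there, the rows land in the dashboard grid like any other dataset.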

And suddenly, a sales dashboard becomes relevant. No longer does the sales team have to view their opportunities and quota achievement in one place, only to leave or open a new browser to access their salesforce.com portal in order to update where they are mid-quarter.

But wait: now they have forgotten which KPIs they need to add comments to, because the KPIs were red on the dashboard, which is now closed, and their sales GM is screaming at them on the phone. Oh wait…they are on the road while this is happening, their iPad data plan has expired, and no wireless connection can be found.

What do you do?

Integrating salesforce.com into their dashboard eliminates at least one step (opening a new browser) in the process. Offering mobile offline transactions is a new feature of MicroStrategy's mobile application. It allows those sales folks to make the comments they need to make while offline, on the road; the comments are queued until they are online again.
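That offline feature follows a classic store-and-forward pattern. Here is a minimal, purely illustrative Python sketch; the class and method names are mine, not MicroStrategy's:

```python
from collections import deque

class OfflineQueue:
    """Store-and-forward queue: comments entered offline are held locally
    and flushed to the server once connectivity returns."""

    def __init__(self, send):
        self.send = send          # callable that posts one transaction upstream
        self.pending = deque()

    def submit(self, txn, online):
        """Send immediately when online; otherwise queue for later."""
        if online:
            self.send(txn)
        else:
            self.pending.append(txn)   # queued while on the road

    def flush(self):
        """Call when the device regains a connection; drains in FIFO order."""
        while self.pending:
            self.send(self.pending.popleft())
```

The key property is that the user's workflow is identical online or offline; only the delivery timing differs.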

One stop, one dashboard to access and take action through, even when offline, using their mobile (Android, iPad/iPhone or BlackBerry) device.

This is why I’m excited to see MicroStrategy pushing the envelope on mobile BI futures.

MicroStrategy Personal Cloud – a Great **FREE** Cloud-based, Mobile Visualization Tool

Have you ever needed to create a prototype of a larger Business Intelligence project focused on data visualizations? Chances are, you have, fellow BI practitioners. Here’s the scenario for you day-dreamers out there:

Think of the hours spent creating wireframes, no matter what tool you used, even if said tool was your hand and a napkin (à la 'back of the napkin' drawing) or the all-time favorite whiteboard, which later becomes a permanent drawing with huge bolded 'DO NOT ERASE OR IT'S OFF WITH YOUR HEAD' annotations dancing merrily around your work. Even better: electronic whiteboards, which yield hard copies of your hard work (so aptly named). At first this seems like the panacea of all things cool (though it has been around for eons), and upon use it is deemed the raddest piece of hardware your company has. Until, of course, you look down at the thermal-paper printout, which has already faded in the millisecond since you tore it from machine to hand, leaving it useless to the naked eye unless you have super spidey-sense optic nerves. But now I digress even further: in the time it took you to try to read the thermal printout, it degraded further, because anything over 77 degrees is suboptimal (last I checked, we clock in at around 98.6, but who's counting). Thus my last stand on thermal-paper electronic whiteboards: they are most awesome when NOT thermoregulated ;).

OK, and now we are back…rewind to sentence one –

Prototyping is to dashboard design or any data visualization design as pencils and grid paper are to me. Mano y mano – I mean, totally symbiotic, right?

But wireframing is torturous when you are in a consultative or pre-sales role, because you can't present napkin designs to a client, or pictures of a whiteboard, unless you are showing them the process behind the design. (By the way, this is an effective "presentation builder" when you are going for dramatic effect –> à la "first there were cavemen, with chisel and stone as all one had to create metrics –> then the whiteboard –> then the…wait!")

This is where said BI practitioner needs to have something MORE for that dramatic pop, whiz-AM to give to their prospective clients/customers in their leave behind presentation.

And finally, the girl gets to her point (you are always so patient, my loving blog readers)…While I am biased, if you forget whom I work for and just take the tool into account, you will see what an awesome fit the new MicroStrategy Personal Cloud is for (drum roll please) PROTOTYPING a new dashboard – or just building, distributing, and mobilizing your spreadsheet of data in a highly stylized, graphical way that tells a story far better than a spreadsheet can in most situations. (Yes, naysayers, I know that in the 5% of circumstances you can name, a spreadsheet is more apropos, but HA HA, I say: this personal cloud product can include the data table along with the data visualizations!)

Best of all, it is free.

I demoed this recently and timed how long it took to upload a spreadsheet, render three different data visualizations, generate the link to send to mobile devices (iPads and iPhones), wait out the network latency for said demo-ees to receive the email with the link, and have them launch the dashboard I created. Guess what the total time was?

It took only 23.7 minutes from concept to mobilization!

Mind you, I was also using data from the prospect that I had never seen or had any experience with.

OK, here is how it was done:

1) Create a FREE account or log in to your existing MicroStrategy account (by existing, I mean if you have ever signed up for the MicroStrategy forums or discussion boards, or you are an employee, then use the same login) at https://www.microstrategy.com/cloud/personal

Cloud Home

Landing Page After Logged in to Personal Cloud

2) Click the button to Create New Dashboard:

Create Dashboard Icon

  • Now, you either need a spreadsheet of data OR you can choose one of the sample spreadsheets that MicroStrategy provides (which is helpful if you want to see how others set up their data in Excel, or how others have used Cloud Personal to create dashboards; even though it is sample data, it is actually REAL data that has been scrub-a-dub-dubbed for your pleasure!). If using a sample data set, I recommend the FAA data. It is real air traffic data, with carrier, airport code, days of the week, etc., which you can use to plan your travel; I do…see the screenshot below. There are some airports, and some carriers who fly into said airports, that I WILL NOT fly on given the days of the week in which I must travel. If there is a choice, I will choose alternate carriers/routes. This FAA data set will enable you to analyze this information and make the most informed decision (outside of price) when planning your travel. Trust me…VERY HELPFUL! Plus, you can look at all the poor slobs without names sitting at the Alaska Air gate who DIDN'T use this information to plan their travel, and as you casually saunter to your own gate on that Tuesday between 3 and 6 PM at SeaTac airport, you will remember that they look so sad because their Alaska Air flight has an 88% likelihood of being delayed or cancelled. (BTW, before you jump on me for my not-so-nice reference to said passengers, it is merely a quotation from my favorite movie, 'Breakfast at Tiffany's'…says Holly Golightly: "Poor cat…poor old slob without a name.")

On time Performance (Live FAA Data)

If using your own data, select the spreadsheet you want to upload

3) Preview your data. IMPORTANT STEP: make sure you change any fields to their correct type (Attribute, Metric or Do Not Import).

Cloud Import - Preview Data

Keep in mind the 80/20 rule: 80% of the time, MicroStrategy will designate your data as either an Attribute or a Metric correctly, using a simple rule of thumb: text (VarChar/NVarChar if using SQL Server) will always be designated as an Attribute (i.e., your descriptor/dimension), and your numerals will be designated as Metrics. BUT if your spreadsheet uses ID fields, like Store ID or Case ID, along with descriptors like Store DESC or Case DESC, most likely MicroStrategy will assume the Store ID/Case ID are Metrics (since those fields are numeric in the source). This is an easy change! You just need to make it ahead of time using the drop-down indicator arrows in the column headings. To find them, hover over the column names with your mouse until you see the drop-down arrow, then click it to change an Attribute column to a Metric column and vice versa (see screenshot):
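That rule of thumb fits in a few lines of code. This Python sketch mirrors the heuristic above; the ID detection is my own illustrative guess, not MicroStrategy's actual import logic:

```python
def infer_role(column_name, values):
    """Rule-of-thumb typing for an imported spreadsheet column:
    text -> Attribute, numeric -> Metric, EXCEPT numeric ID columns,
    which are descriptors and belong as Attributes."""
    tokens = column_name.lower().replace("_", " ").split()
    if tokens and tokens[-1] == "id":
        return "Attribute"   # Store ID / Case ID: numeric in the source, but descriptive
    if values and all(isinstance(v, (int, float)) for v in values):
        return "Metric"
    return "Attribute"
```

It gets the 80% right automatically; the ID columns are exactly the 20% you correct by hand with the drop-down arrows.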

Change Attribute to Metric

Once you finish previewing your data and everything looks good, click OK at the bottom right of your screen.

In about 30-35 seconds, MicroStrategy will have imported your data into the Cloud for you to start building your awesome dashboards.

4) Choose a visualization from the menu that pops up on your screen upon successfully importing your spreadsheet:

Dashboard Visualization Selector
Change data visualization as little or as often as you choose

Here is the 2010 NFL data I uploaded this morning. It is a heatmap showing the home teams as well as any teams they played in the 2010 season. The size of each box shows how big the win or loss was; the color indicates whether the home team won or lost (green = home team won // red = home team lost).
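The encoding behind that heatmap (box size = margin, color = result) is easy to reproduce from raw game rows; a small illustrative Python sketch, with made-up field names:

```python
def heatmap_cells(games):
    """Map (home, away, home_pts, away_pts) rows to heatmap cells:
    size = margin of the win or loss, color = green if the home team won, red if it lost."""
    cells = []
    for home, away, hp, ap in games:
        cells.append({
            "home": home,
            "away": away,
            "size": abs(hp - ap),                  # how big the win or loss was
            "color": "green" if hp > ap else "red",
        })
    return cells
```

Feed the cells to any treemap/heatmap renderer and you get the same picture the Personal Cloud visualization draws.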

To all of you, dear readers, I bid a Happy New Year. May your ideas flow aplenty, and may your data match your dreams (of what it should be) :). Go fearlessly into the new world order of business intelligence, and know that I, Laura E., your Dashboard Design Diva, called Social Intelligence the New Order in 2005, again in 2006, and in 2007. :) Cheers, y'all.

http://tinyurl.com/ckfmya8

https://my.microstrategy.com/MicroStrategy/servlet/mstrWeb?pg=shareAgent&RRUid=1173963&documentID=4A6BD4C611E1322B538D00802F57673E&starget=1


Business Intelligence Clouds – The Skies the Limit

I am back…(for now, or so it seems these days) – I promise to get back to one post a month if not more.

Yes, I am known for my frequent use of puns, bordering on the line between cheesy and relevant. Forgive the title. It has been over 110 days since I last posted, which for me is a travesty. Despite my ever-growing list of activities, both professional and personal, I have always put my blog in the top priority quadrant.

Enough ranting…I diverged; and now I am back.

Ok, cloud computing (BI-tools related) seems to be all the rage, right up there with Mobile BI, big data and social. I dare use my own term, 'Social Intelligence,' coined back in 2007, as others have now trademarked the phrase (but we, dear readers, know the truth –> we have been thinking about the marriage between social networking / social media data sets and business intelligence for years now)…Alas, I diverge again. Today, I have been thinking a lot about cloud computing and Business Intelligence.

Think about BI and portals, like SharePoint (just to name one). It was all the rage (or perhaps still is): "Integrate my BI reporting with my intranet / portal / SharePoint web parts." OK, once that was completed successfully, did it buy much in terms of adoption or savings, or any of those ROI catchphrases – "Buy our product, and your employees will literally save so much time they will be basket-weaving their reports into TRUE analysis"? What they didn't tell you was that more bandwidth meant less need for those people, which in turn meant people went into scarcity mode/tactics, trying to make themselves seem, or be, relevant. And I don't fault them for this. Companies were not ready, or did not want to think about, what they were going to do with the newly freed-up resources they would have when the panacea of BI deployments actually came to fruition.

And so the wheel turned. What was next? Reports became dashboards; dashboards became scorecards (each complementing the former); scorecards introduced proactive notification/alerting; alerting introduced threshold-based notification across multiple devices and methods, one of which was mobile; mobile notification brought the need for mobile BI –> and frankly, I will say it: Apple brought us the hardware to see the latter into fruition. Swipe, tap, double tap –> drill-down was now fun. Mobile made portals seem like child's play. But what about when you need to visualize something and ONLY have it on a spreadsheet?

(I love hearing this one, as if the multi-billion-dollar company whose employee claims to only have the data on a spreadsheet didn't get it from somewhere else; I know, I know –> in the odd case, yes, this is true…so I will play along)…

The "only on a spreadsheet" crowd made mobile seem restrictive. Enter RoamBI and the likes of MicroStrategy (yes, MicroStrategy now has a data import feature for spreadsheets, with advanced visualizations for both web and mobile). Enter QlikView for the web crowd. The "I'm going to build a dashboard in less than 30 minutes" salesforce: "Wait…that's not all, folks…come now (to the meeting room) with your spreadsheet, and watch our magicians create dashboards to take with you from the meeting."

But no one cared about maintenance, data integrity, cleanliness or accuracy…I know, they are meant to be nimble, and I see their value in some instances and circumstances, just like the multi-billion-dollar company that only tracks data on spreadsheets. I get it; such circumstances exist. But they are not the norm.

So here we are…mobile offerings here and there; build a dashboard on the fly; import spreadsheets during meetings. But what happens when you go back to your desk and (still) have to open up your portal, and now have a new dashboard that only you can see unless you forward it out manually?

Enter cloud computing for BI, but not at the macro scale; let's talk personal…personal clouds: individual sandboxes of a predefined amount of space, over which IT has no sanction other than to bless how much space is allocated. From there, what you do with it is up to you. Hackles going up, I see…how about this…


Salesforce.com –> the biggest CRM cloud today. And for the last many years, SFDC has embraced cloud computing. And big data, for that matter; and databases in the cloud (database.com, in fact)…Lions and tigers and bears, oh my!

So isn't it natural for BI to follow CRM into cloud computing? OK, OK…for those of you whose hackles are still up, some rules (you IT folks will want to read further):

Rules of the game:

1) Set an amount of space (not to be exceeded, no matter what), but be fair and realistic; 100 MB is useless. In today's world, a 4 GB zip drive was advertised for $4.99 during the back-to-school sales, so I think you can pony up enough to make the cloud useful.

2) If you delete it, there is a recycling bin (like on your PC/Mac); if you permanently delete it, too bad / so sad. We need to draw the line somewhere. Poor SharePoint admins around the world are having to drop into STSADM commands to restore Alvin Analyst's Most Important Analysis, which he not only moved into the recycling bin but then permanently deleted.

3) Put some things of use in this personal cloud at work, like BI tools; upload a spreadsheet and build a dashboard in minutes with visualizations like the graph matrix (a crowd pleaser) or a time-series slider (another crowd favorite; people just love time-based data :). But I digress (again)…

4) Set up BI reporting on the logged events. Understand how many users are using your cloud environment, how many are getting errors, and what and why they are getting errors; this simple type of event-based logging is very informative. (We BI professionals tend to overthink things, especially those of us who are also physicists.)
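A minimal sketch of that event-based reporting in Python, assuming a simple list of log records with user, status and type fields (the record shape is illustrative, not any product's schema):

```python
from collections import Counter

def usage_report(events):
    """Summarize cloud event logs: distinct users, error count, and errors by type."""
    errors = [e for e in events if e["status"] == "error"]
    return {
        "distinct_users": len({e["user"] for e in events}),
        "error_count": len(errors),
        "errors_by_type": Counter(e.get("type", "unknown") for e in errors),
    }
```

Even this much tells you who is using the environment and which errors cluster where, which is the whole point of rule 4.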

5) Take a look at what people are using the cloud for. If you create and add meaningful tools like BI visualizations and data import, and offer viewing via mobile devices like iPhone/iPad and Android, or via the web, people will use it…

This isn't a corporate iTunes or MobileMe cloud; this isn't Amazon's Elastic Compute Cloud (EC2). This is a cloud with the sole purpose of supporting BI; wait, not just supporting, but propelling users out of the doldrums of the current state of affairs and into the future.

It’s tangible and just cool enough to tell your colleagues and work friends “hey, I’ve got a BI cloud; do you?”

BIPlayBook.Com is Now Available!

As an aside, I'm excited to announce that my latest website, http://www.biplaybook.com, is finally published. Essentially, I decided that you, dear readers, were ready for the next step. What comes next, you ask?

After Measuring BI data –> Making Measurements Meaningful –> and –> Massaging Meaningful Data into Metrics, what comes next is the age-old question of 'So what?' and 'What do I do about it?'

BI PlayBook offers readers the next level of real-world scenarios, now that BI has become the nomenclature of yesteryear and is used by most to inform decisions. Basically, it is the same, with the added bonus of how to tie BI back into the original business process, the customer service/satisfaction process, or really any process of substance within a company.

This is quite meaningful to me because, so often, as consumers of goods and services, we find our voices go unheard, especially when we are left dissatisfied. Can you muster the courage to voice your issue (dare I say, 'complain'?) using the only tools provided: poor website feedback forms, surveys, or (gasp) relaying the issue by calling into a call center or IVR system (double gasp)? I don't know if I can…

How many times do we get caught in the endless loop of an IVR, only to be 'opted out' (aka hung up on) when we do not press the magical combination of numbers on our keypads to reach a live human being? Or, when we are sneaky and press '0', we find the company is one step ahead of us, having programmed '0' to automatically transfer the call to our friend 'ReLisa Boutton' – aka the Release Button.

Feedback is critical, especially as our world has become consumed by social networks. The 'chatter' of customers that ensues – choosing to 'Like' or join a company page or product, or tweeting about the merits or demerits of one's value proposition – is rich material if one cares about understanding one's customer. It is also a key to how well you are doing in the eyes of your customer. Think about how many customer satisfaction surveys you have taken that ask whether you would recommend a company to a friend or family member.

This measure defines one's NPS, or Net Promoter Score, a commonly shared KPI (key performance indicator) for a company.
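For reference, the Net Promoter metric is conventionally computed from the 0-10 "would you recommend us?" question as the percentage of promoters (scores of 9-10) minus the percentage of detractors (0-6); a minimal Python sketch:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).
    Passives (7-8) count in neither group but dilute both percentages."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))
```

The result ranges from -100 (all detractors) to +100 (all promoters).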

Yet market researchers like myself know that what a customer says on a survey isn't always how they will behave. This discrepancy between what someone says and what someone does is as age-old as our parents telling us as children, "Do not as I do, but as I say." However, the paradigm no longer holds. Therefore, limiting oneself to the NPS score will restrict the ability to truly understand one's Voice of the Customer. And further, if you do not understand your customers' actual likelihood to recommend you to others, or to purchase from you again, how can you predict their lifetime value or propensity for future revenue? You can't.

Now, I am ranting. I get it.

But I want you to understand that the social media content available from the social networking spheres can fill that gap. It can help you understand how your customers truly perceive your goods or services. Trust me: customers are more likely to tweet a vent, in 140 characters or less, about a negative experience than they are to take the time to fill out a survey. Likewise, they are more likely to rave about a great experience with your company.

So, why shouldn’t this social ‘chatter’ be tied back into the business intelligence platforms, and further, mined out specifically to inform customer feedback loops, voice of the customer & value stream maps, for example?

Going one step further, having a BI PlayBook focuses the attention of metric owners on the areas that need to be addressed, while filtering out the noise that can detract from the intended purpose.

If we are going to make folks responsible for the performance of a given metric, shouldn’t we also help them understand what is expected of them up front, as opposed to when something goes terribly wrong, signified by the “text message” tirade of an overworked CEO waking you out of your slumber at 3 AM?

Further, understanding how to address an issue, who to communicate to and most importantly, how to resolve and respond to affected parties are all part of a well conceived BI playbook.

It truly takes BI to that next level. In fact, two years ago, I presented this very topic at the TDWI Executive Summit in San Diego (Tying Business Processes into Your Business Intelligence). While I got a lot of stares, à la the 'dog tilting its head in that confused glare at its owner' look, I hope people can draw back on that experience with moments of 'ah ha – that is what she meant,' now that they have evolved (a little) in their BI maturation.

Gartner BI Magic Quadrant 2011 – Keeping with the Tradition

Gartner Magic Quadrant 2011


I have posted the Gartner Business Intelligence ('BI') Magic Quadrant (in addition to the ETL quadrant) for the last several years. To say that I missed the boat on this year's quadrant is a bit extreme, folks, though I am sorry for my delay. I did not realize there were readers who counted on me to post this information each year. I am a few months behind the curve in getting this to you, dear readers. But with that said, it is better late than never, right?

Oh, and who is really ‘clocking’ me anyway, other than myself? But that is a whole other issue for another post, some other day.

As an aside, I am excited to say that my latest website, http://www.biplaybook.com, is finally published. Essentially, I decided that the next step, after Measuring BI data, Making the Measurements Meaningful, and Modifying Meaningful Data into Metrics, was to address the age-old question of 'So what?' or 'What do I do about it?'

BI PlayBook offers readers real-world scenarios that I have solved using BI or data visualizations of one sort or another, with the added bonus of how to tie the solution back into the original business process you were reporting on or trying to help with BI, or back into the customer service/satisfaction process. This latter one is quite meaningful to me, because so often our voices go unheard, especially when we complain to large corporations via website feedback, surveys or (gasp) calling into their call center(s). Feedback should be tied directly back into the performance being measured, whether it is operational, tactical, managerial, marketing, financial, retail, production and so forth. So, why not tie that back into your business intelligence platforms, using feedback loops and voice-of-the-customer maps / value stream maps to do so?

Going one step further, a BI PlayBook lets the end users of your BI systems — the ones signed up as responsible for the metrics being visualized and reported out to the company — know what they are expected to do to address a problem with a metric, who they should communicate both the issue and the resolution to, and what success looks like.

Is it really fair of us, BI practitioners, to build and assign responsible ownership to our leaders of the world without giving them some guidance (documented, of course) on what to do about these new responsibilities? We are certainly the first to be critical when a ‘red’ issue shows up on one of our reports/dashboards/visualizations. How cool would it be to look at these red events, see the people responsible getting alerted to said fluctuation, and further, see said person take appropriate and reasonable steps toward resolution? Well, a playbook offers the roadmap and guidance around this very process.
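To make this concrete, here is a minimal sketch of what a playbook entry might look like in code. The metric, owner, threshold and resolution steps are all hypothetical names I made up for illustration, not from any real system:

```python
# A minimal sketch of a BI playbook entry: each metric maps to an owner,
# an escalation contact, and documented resolution steps. All names and
# thresholds are illustrative.
PLAYBOOK = {
    "daily_sales": {
        "owner": "VP Sales Ops",
        "escalate_to": "CFO",
        "red_threshold": 0.90,  # actual/target below 90% turns the KPI red
        "resolution_steps": [
            "Verify the overnight ETL load completed",
            "Compare against the same weekday last month",
            "Notify regional managers of confirmed shortfalls",
        ],
    },
}

def check_metric(name, actual, target):
    """Return the playbook guidance when a metric goes red, else None."""
    entry = PLAYBOOK[name]
    if target and actual / target < entry["red_threshold"]:
        return {
            "status": "RED",
            "owner": entry["owner"],
            "escalate_to": entry["escalate_to"],
            "steps": entry["resolution_steps"],
        }
    return None

alert = check_metric("daily_sales", actual=820_000, target=1_000_000)
```

The point is that the guidance travels with the alert: the responsible person sees not just a red indicator, but who to tell and what to do next.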

It truly takes BI to that next level. In fact, two years ago, I presented this very topic at the TDWI Executive Summit in San Diego (Tying Business Processes into your Business Intelligence). The PlayBook is the documented ways and means to achieve this outcome in a real-world situation.

To Start Quilting, One Just Needs a Set of Patterns: Deconstructing Neural Networks (my favorite topic of the day, week or year)

How a Neural Network Works:

A neural network (#neuralnetwork) uses rules it “learns” from patterns in data to construct a hidden layer of logic. The hidden layer then processes inputs, classifying them based on the experience of the model. In this example, the neural network has been trained to distinguish between valid and fraudulent credit card purchases.
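For the curious, here is a toy sketch of that hidden layer in Python. The weights are illustrative stand-ins for rules a real network would learn from historical transactions, and the two inputs (normalized purchase amount and distance from home) are hypothetical features I chose for the example:

```python
import math

# Toy forward pass through one hidden layer. Two inputs (purchase amount
# and distance-from-home, both normalized to 0-1) feed two hidden units
# whose weights stand in for "learned" fraud rules. Not a trained model.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

HIDDEN_W = [[4.0, 4.0], [-3.0, 5.0]]   # weights for two hidden units
HIDDEN_B = [-4.0, -2.0]                # hidden-unit biases
OUT_W = [3.0, 3.0]                     # output-layer weights
OUT_B = -3.0

def classify(amount, distance):
    # Hidden layer: each unit squashes a weighted sum of the inputs.
    hidden = [sigmoid(w[0] * amount + w[1] * distance + b)
              for w, b in zip(HIDDEN_W, HIDDEN_B)]
    # Output layer combines the hidden activations into a fraud score.
    score = sigmoid(sum(w * h for w, h in zip(OUT_W, hidden)) + OUT_B)
    return "fraudulent" if score > 0.5 else "valid"
```

A small purchase close to home scores low; a large purchase far from home pushes the hidden units, and thus the fraud score, high.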

This is not your mom’s apple pie or the good old days of case-based reasoning or fuzzy logic. (Although, the latter is still one of my favorite terms to say. Try it: fuzzzzyyyy logic. Rolls off the tongue, right?)…But I digress…

And, now, we’re back.

To give you a quick refresher:


Case-based reasoning represents knowledge as a database of past cases and their solutions. The system uses a six-step process to generate solutions to new problems encountered by the user.
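A bare-bones sketch of the core retrieve-and-reuse idea, with a made-up case base (the full six-step process is collapsed to its first steps here purely for illustration):

```python
# Sketch of case-based reasoning's retrieve step: find the past case most
# similar to the new problem and reuse its solution. The case base and
# the similarity measure (count of shared attributes) are illustrative.
CASES = [
    ({"symptom": "no_power", "device": "laptop"}, "Replace the AC adapter"),
    ({"symptom": "no_network", "device": "laptop"}, "Toggle the wireless switch"),
    ({"symptom": "no_power", "device": "phone"}, "Charge for 30 minutes, then hold power"),
]

def retrieve(problem):
    """Return the solution of the most similar past case."""
    def similarity(case):
        return sum(1 for k, v in problem.items() if case[0].get(k) == v)
    best = max(CASES, key=similarity)
    return best[1]
```

In a real CBR system the retrieved solution would then be adapted, tested and retained back into the case base; this sketch stops at retrieval.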

We’re talking old school, folks. Think of those frustrating FAQ pages where you type a question into a search box, only to have follow-on questions prompt you for further clarification, and with each one, further frustration. Oh, and BTW, these are the same FAQ pages which e-commerce sites laughably call ‘customer support’ –

“And I wonder why your ACSI customer service scores are so low, Mr. or Mrs. e-Retailer :),” says this blogger, facetiously, to her audience.

And we’re not talking about fuzzy logic either. Simply put, fuzzy logic is fun to say, yes, and technically it is:

fuzzy logic

–> Rule-based technology with exceptions (see arrow 4)

–> Represents linguistic categories (for example, “warm”, “hot”) as ranges of values

–> Describes a particular phenomenon or process and then represents it in a small number of flexible rules

–> Provides solutions to scenarios typically difficult to represent with succinct IF-THEN rules

(Graphic: Take a thermostat in your home and assign membership functions for the input called temperature. This becomes part of the logic of the thermostat to control the room temperature. Membership functions translate linguistic expressions such as “warm” or “cool” into quantifiable numbers that computer systems can then consume and manipulate.)
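The thermostat example translates to code quite directly. Below is a hedged sketch using triangular membership functions; the category names come from the description above, but the temperature breakpoints are illustrative, not from any real controller:

```python
# Sketch of fuzzy membership functions for a thermostat. A triangular
# membership function maps a temperature to a degree of membership (0-1)
# in a linguistic category. Breakpoints are illustrative (Fahrenheit).
def triangular(x, left, peak, right):
    """Membership rises linearly from left to peak, then falls to right."""
    if x <= left or x >= right:
        return 0.0
    if x <= peak:
        return (x - left) / (peak - left)
    return (right - x) / (right - peak)

def memberships(temp_f):
    return {
        "cool": triangular(temp_f, 40, 55, 70),
        "warm": triangular(temp_f, 60, 72, 85),
        "hot":  triangular(temp_f, 78, 95, 110),
    }

m = memberships(66)  # 66°F is partly "cool" and partly "warm" at once
```

That simultaneous partial membership — a temperature that is somewhat cool *and* somewhat warm — is exactly what crisp IF-THEN rules struggle to express.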

 

Nope, we are talking Neural Networks – the absolute bee’s knees in my mind, right up there with social intelligence and my family (in no specific order :) ):

–> Find patterns and relationships in massive amounts of data that are too complicated for humans to analyze

–> “Learn” patterns by searching for relationships, building models, and correcting the model’s own mistakes over and over again

–> Humans “train” the network by feeding it training data for which the inputs produce a known set of outputs or conclusions, helping the neural network learn the correct solution by example

–> Neural network applications in medicine, science, and business address problems in pattern classification, prediction, financial analysis, and control and optimization
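That "learning by correcting its own mistakes" loop can be sketched with the simplest possible neural unit, a perceptron, trained here on a toy OR function. The data, learning rate and epoch count are all illustrative choices:

```python
# Minimal sketch of training by example: a perceptron nudges its weights
# every time its prediction disagrees with the known output.
def train_perceptron(samples, epochs=10, lr=0.5):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - pred       # nonzero only on a mistake
            w[0] += lr * err * x1     # nudge weights toward the answer
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Toy training data: the logical OR function, with known outputs.
OR_DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(OR_DATA)
```

After a few passes the weights settle and the unit reproduces every known input/output pair — the "learn the correct solution by example" idea in miniature.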

 

Remember folks: Knowledge is power and definitely an asset. Want to know more? I discuss this and other intangibles further in part 1 of a multi-part study I am conducting called:

Measuring Our Intangible Assets, by Laura Edell

Investigative Analysis Part 1: Quantifying the Market Value of an Organization’s Intangible Asset Known as ‘Knowledge’

OK, so I’ve decided to conduct another multi-part study similar to what I did last year.

This time, I will be analyzing and attempting to quantify an organization’s intangible assets. Specifically, the following:

• knowledge, brands, reputations, and unique business processes

So, starting with knowledge. Firstly, the chart is a little outdated, but I will source the last two years and update the graph later in the series. Regardless, it is interesting nonetheless. And since I am the Queen advocate for measuring what matters and managing what you can measure, consider the following my attempt to drink my own Kool-Aid: the chart below depicts revenue growth over a seven-year period ending in 2008. Those of you, my dear readers, who are also fellow Business Intelligence practitioners should be able to attest at first glance to this statistical representation of Content Management Systems (CMS) and portals’ YoY revenue growth.

In fact, many of us have been asked to integrate BI dashboards and reports into existing corporate portals, like Microsoft SharePoint, or into the native portals bundled with most enterprise-grade BI products like MicroStrategy or SAP/Business Objects, right? Many of us have been tasked with drafting data dictionaries, data governance documentation, and source-protected project and code repositories; i.e., knowledge capture areas. But even with my vast knowledge (no pun intended), I was unaware that the growth spurt specific to CMSs was as dramatic as this, depicted below and sourced from Prentice Hall.

In fact, between 2001 and 2008, CMS revenue growth went from ~$2.5B to ~$22B, with the greatest spurt beginning in 2003 and skyrocketing up from there.

 

Conversely, the portal revenue growth was substantially less. This was a surprise. I must have heard the words SharePoint and Implementation more than any other between 2007 – 2009, whereas the sticker shock that came with an enterprise grade CMS sent many a C-level into the land of Nod, never to return until the proven VALUE cloud could ride them home against the nasty cop known as COST.

Ah-ha moment, folks: portal products were far less costly than the typical Documentum or IBM CMS.

In fact, Jupiter’s recent report on CMSs stated:

“In some cases, an organization will deploy several seemingly redundant systems. In our sampling of about 800 companies that use content management packages, we discovered that almost 15 percent had implemented more than one CMS, often from competing vendors. That’s astounding, especially when you consider that an organization that deploys two content management systems can rack up more than $1 million in licensing fees and as much as $300,000 in yearly maintenance costs. Buying a second CMS should certainly raise a red flag for any CIO or CFO about to approve a purchase order.”

That’s 120 companies from the Jupiter study spending $1M in licensing, or $120M baseline. Extend that to all organizations leveraging CMS technology and therein lies the curious case of the revenue growth spurt.
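For those checking my math, the back-of-envelope calculation goes like this:

```python
# Back-of-envelope check of the Jupiter figures quoted above.
sampled = 800            # companies in the Jupiter sample
dual_cms_rate = 0.15     # ~15% ran more than one CMS
license_cost = 1_000_000 # licensing fees for a second CMS

dual_cms_companies = int(sampled * dual_cms_rate)    # 120 companies
baseline_spend = dual_cms_companies * license_cost   # $120M baseline
```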

To that, I say, Kiss My Intangible Assets! Knowledge is power, except when parked in someone’s head. Now, when will someone invent the physical drainage system for exactly that knowledge, with or without the permission of its holder? These gatekeepers need to go; they are often the dinosaurs fearing the newbie college grads and, worst of all, CHANGE.

In part 2, we will discuss another fave of mine: Brand You!

Gartner VP Addresses Prerequisites for Developing Corporate Social Media Policies

Carol Rozwell, a well-respected and distinguished Gartner analyst, might be my personal hero. "Social media offers tempting opportunities to interact with employees, business partners, customers, prospects and a whole host of anonymous participants on the social Web," the vice-president said recently. "However, those who participate in social media need guidance from their employer about the rules, responsibilities, ‘norms’ and behaviors expected of them, and these topics are commonly covered in the social media policy."

Gartner has identified seven critical questions that designers of social media policy must ask themselves:

What Is Our Organization’s Strategy for Social Media?
There are many possible purposes for social media. It can be used for five levels of increasingly involved interaction (ranging from monitoring to co-creation) and across four different constituencies (employees, business partners, customers and prospects, and the social Web). It is critical that social media leaders determine the purpose of their initiatives before they deploy them and that those responsible for social media initiatives articulate how the organization’s mission, strategy, values and desired outcomes inform and impact on these initiatives. A social media strategy plan is one means of conveying this information.

Who Will Write and Revise the Policy?
Some organizations assign policy writing to the CIO, others have decided it’s the general counsel’s job, while in other cases, a self-appointed committee decides to craft a policy. It’s useful to gain agreement about who is responsible, accountable, consulted and involved before beginning work on the policy and, where possible, a cross-section of the company’s population should be involved in the policy creation process. It’s important to remember that there is a difference between policy — which states do’s and don’ts at a high level — and operational processes, such as recruitment or customer support — which may use social media. These operational processes need to be flexible and changeable and adhere to the policy, but each department/activity will need to work out specific governance and process guidelines.

How Will We Vet the Policy?
Getting broad feedback on the policy serves two purposes. First, it ensures that multiple disparate interests such as legal, security, privacy and corporate branding, have been adequately addressed and that the policy is balanced. Second, it increases the amount of buy-in when a diverse group of people is asked to review and comment on the policy draft. This means that the process by which the policy will be reviewed and discussed, along with the feedback, will be incorporated into the final copy. A vetting process that includes social media makes it more likely that this will occur.

How Will We Inform Employees About Their Responsibilities?
Some organizations confuse policy creation with policy communication. A policy should be well-written and comprehensive, but it is unlikely that the policy alone will be all that is needed to instruct employees about their responsibilities for social media. A well-designed communication plan, backed up by a training program, helps to make the policy come to life so that employees understand not just what the policy says, but how it impacts on them. It also explains what the organization expects to gain from its participation in social media, which should influence employees in their social media interactions.

Who Will Be Responsible for Monitoring Social Media Employee Activities?
Once the strategy has been set, the rules have been established and the rationale for them explained, who will ensure that they are followed? Who will watch to make sure the organization is getting the desired benefit from social media? A well-designed training and awareness program will help with this, but managers and the organization’s leader for social media also need to pay attention. Managers need to understand policy and assumptions and how to spot inappropriate activity, but their role is to be more of a guide to support team self-moderation, rather than employ a top-down, monitor-and-control approach.

How Will We Train Managers to Coach Employees on Social Media Use?
Some managers will have no problem supporting their employees as they navigate a myriad of social media sites. Others may have more trouble helping employees figure out the best approach for blogs, microblogs and social networking. There needs to be a plan for how the organization will give managers the skills needed to confront and counsel employees on this sensitive subject.

How Will We Use Missteps to Refine Our Policy and Training?
As with any new communications medium, some initiatives go exceptionally well, while others run adrift or even sink. Organizations that approach social media using an organized and planned approach, consistent with the organization’s mission, strategy and values, will be able to review how well these initiatives meet their objectives and use that insight to improve existing efforts or plan future projects better.

More information is available in the report "Answer Seven Critical Questions Before You Write Your Social Media Policy," which can be found on the Gartner website at http://www.gartner.com/resId=1522014.

 

In addition, I wanted to add the following points:

I am all about the process. And a process for establishing a social media strategy (internal- or external-facing) has several steps which flow sequentially for the varying audience members who will consume or provide this information.

 

First, it is important to understand your corporate strategic goals. Even if social media isn’t explicitly defined in them, it is certainly an input to several common objectives, like acquiring/retaining new/existing customers (Marketing), world-class operations (real-time fodder is a great tool for handling customer service complaints as they happen), etc.

Second, you need to functionally understand the impact domains and what purpose a social strategy will serve: which groups will be impacted by a social media strategy, and what, if anything, are they already doing to address it? Characteristics of a good purpose, according to Carol Rozwell:

 

1. Magnetic
2. Aligned
3. Properly-scoped
4. Promotes Evolution
5. Low risk
6. Measurable
7. Community-driven

 

Third, connecting the corporate goals from the strategic plans to the social media purpose / strategy is key – that is what is defined by Aligned and Properly Scoped. All strategic plans evolve over time so why wouldn’t your social purpose evolve as well?

 

Fourth, Measurement. This is near and dear to my heart: measuring what matters. Business intelligence tools are starting to realize the value of offering real-time capabilities to track the chatter across the social sphere; think about my Wynn Hotel examples from previous posts to validate the power this can provide toward improving customer experience and, ultimately, long-term retention of your customers.

Community-driven is self-explanatory. You cannot tell a customer what their voice should be, it is what it is.

You as an organization need to understand that word of mouth from your customers is worth its weight in gold, more than the millions spent on advertising budgets and huge marketing campaigns. Communities offer the soapbox that so many customers want to stand upon to share their experiences.

You reap the benefits by understanding this voice and consuming this information in a meaningful, metrics-driven approach that can provide context to your strategic goals without augmenting them with cost-laden initiatives or proposals.

“LAURA” Stratification: Best Practice for Implementing ‘Social Intelligence’

Doing an assessment of how and where to learn social media to better understand your business drivers can be daunting, especially when you want to overlay how those drivers affect your goals, customers, suppliers, employees, partners…you name it.

I came up with this process which happens to mimic my name (shameless self-persona plug) to ease the assessment process while providing a guided assessment plan.

First, ‘Learn’ to Listen: learning from the voice of the customer/supplier/partner is an extremely effective way to understand how well you are doing retaining, acquiring or losing your relationships with those who you rely on to operate your business.

Second, Analyze what matters; ignore or shelve (for later) what doesn’t. Data should be actionable (metrics in your control to address), reporting key performance indicators that are tied to corporate strategies and goals to ensure relevancy.

Third, Understand your constituent groups; it isn’t just your customers, but also your shareholders, employees, partners, and suppliers who can make or break a business through word of mouth and social networking.

Fourth, Relate your root causes to your constituents’ value perceptions, loyalty drivers and needs to ensure relevancy flows through from step 2. Map these to your business initiatives and goals from the step 2 exercise. Explore gaps between initiatives, value perceptions, loyalty drivers and corporate goals.

Lastly, create Action plans to address the gaps discovered in Step 4. If you analyzed truly actionable data in step 2, this should be easy to do.

To apply this to social media in order to turn it into social intelligence, you need to make the chatter of the networks meaningful and actionable.

To do this, think about this example:

 

A person tweets a desire to stop using a hotel chain because of a bad experience. In marketing, this is known as an “intent to churn” event; when a social intelligence reporting system ferrets out this intent by scouring the web commentaries of social networks, an alert can be automatically forwarded to your customer loyalty, marketing/social media or customer response teams to respond, address and retain said customer.

A posting might say “trouble with product or service” – That type of message can be sent to customer operations (service) or warranty service departments as a mobile alert.

And a “having trouble replenishing item; out of stock” question on a customer forum can be passed along to your supply chain or retail teams — all automatically.
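A hedged sketch of how such routing might work under the hood: scan the post for intent keywords and forward it to the matching team. The keywords and team names are purely illustrative, not any vendor's actual rules (real systems use far richer sentiment and intent models):

```python
# Keyword-based routing sketch: each intent signal maps to the team that
# should receive the alert. All keywords and team names are illustrative.
ROUTES = [
    (("never staying", "switching to", "last time"), "customer_loyalty"),
    (("trouble with", "broken", "defective"), "customer_service"),
    (("out of stock", "can't find", "sold out"), "supply_chain"),
]

def route_post(text):
    """Return the team to alert for a social post, or None if no match."""
    lowered = text.lower()
    for keywords, team in ROUTES:
        if any(k in lowered for k in keywords):
            return team
    return None
```

An "intent to churn" post lands with the loyalty team; a stock complaint goes to supply chain — automatically, and in real time if the feed is live.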

The Wynn has a great feedback loop, using social media to alert staff in real time when customers tweet or comment about being dissatisfied during their stay.

The hotel manager and response team will find this person to address and rectify the situation before they check out. And before long, the negative tweet or post is replaced by an even more positive response and, best of all, WORD of MOUTH to friends and family.

It’s sad to say that in this day and age, we are often left without a voice, or at least one that is heard by our providers of services and products. When good service comes, we are so starved for it that we rejoice about it to the world. And why not? That is how good companies excel and excellent companies hit the echelon of amazing companies!

‘Social Intelligence’, the bridge between social networking and business intelligence, Starts To Build Momentum

Several years ago (in early 2009), I blogged about two of my passions, social networking and business intelligence. It was about the time that business folks started building their profiles on LinkedIn, extending their networks via Twitter, and realizing that Facebook wasn’t just a tool for their children to build their socialization skills but a vehicle for networking with other professionals within and outside of their own personal networks. Grasping the power of the social network was still an abstruse, almost arcane concept in its theoretical potential for corporate America. And while there were visionaries, like the Wynn in Las Vegas, about whom I shared an anecdote within my TDWI presentation on Social Intelligence (one I will share in a moment) later that year, most companies saw social networking websites as distractions and often banned them from use during the work day.

Why was Wynn different?

As a frequent corporate traveler, I have had many “check-in” line experiences: from the car rental counter to the hotel check-in line, I have had both good and bad experiences. On one somewhat lackluster experience, I was standing in line to check into the Wynn Hotel in Vegas. Several people ahead of me was a gentleman, fairly polished but obviously frustrated by his conversation with the desk clerk. As a highly perceptive observer (or at least, that is how I am spinning being nosy), I listened in on the situation. This gentleman had reserved a junior suite, since he and a colleague were sharing the room, a common occurrence as companies started to tighten their belts around corporate travel expenses. And, the suite was not available. The clerk seemed to want to help but was strapped by her computer system telling her no suites were available until the following night in the category booked. It turns out, she was new.

Quite gruffly, this gentleman left the line and proceeded to stand in the lobby, talking to his colleague about the disappointment, and commented that he was going to Tweet (post a message to his Twitter account) that buyer beware when it came to staying at the Wynn. Now, in a city like Las Vegas where capacity can exceed occupancy rates, combined with a name like the Wynn and the sheer reach of a site like Twitter, this kind of negative word of mouth can really hurt a vendor. And more often than not, comments like this are overlooked, or at least were overlooked in the past, because of the lack of technology or reporting to alert such vendors to such disturbances in real time. In a travel situation, do you really want your issue addressed only after your trip, with a gift and apology in the form of a coupon toward a future stay?

No… In fact, the breakage rate on such post-trip coupons is 70-80% (remember, I used to work for the largest online travel consortium) :). Thus, granting coupons is ineffective at winning the customer back. And that is because your trip, whether for business or pleasure, was ruined. And no, I am not being dramatic. You might not think a rooming issue can ruin a trip, but it can. In fact, just being placed on the wrong floor or near an elevator, or merely any event that is different from what you were expecting, can ruin a trip from a customer’s perspective.

But, I digress…

Back to my story: As soon as the customer finished posting his Tweet to Twitter, he turned to his colleague, walked to a cafe and sat down to order some refreshments. By the time I reached the front of the check-in line, I noticed someone who appeared to be in charge (dark suit, name plate, piece of paper in his hand) approach the gentleman and start a dialogue with him. Within moments, the two shook hands, the papers (which turned out to be room keys and an invoice) were swapped, and the authority figure went about his business.

Intrigued, I walked up and asked the gentleman what had happened. He was so excited that he asked me to wait while he posted a note to Twitter. Since I had heard the original part of the story, I started to deduce what was happening. When he was finished, he said that the man was the hotel manager, who had been alerted to the room situation via a Twitter application which pushed travel disruptions, as they occurred in real time, to his smart phone. It was his job to make sure such customers were found in the hotel and the situation fixed to the betterment of the customer, no matter the situation. In this case, the customer was treated to an upgraded full suite, which was available, at no additional cost, and given vouchers for the show that evening. The customer was so pleased, he had to go back to Twitter to recant his previous post and to alert people to how well the situation was handled, not days after the fact, but within the hour of it occurring.

I was floored.

You hear about the concept of the customer feedback loop but rarely do you see it implemented well or in a way that can affect overall customer loyalty or perception of the brand. In this case, it not only affected the customer and his colleague, but his entire social network.

Later, I found that same manager and asked him what he had used to alert him to the Twitter incident from earlier.

He smiled and said: we are in the business of pleasure; thus, it is our job to know when we fail. Alerting in real time is not as hard as you think with the right tools and technology. And he left it at that.

OK, so Vegas is a pretty secretive world of proprietary tools and technology, and its operators are often market leaders when it comes to adoption.

And that is where Social Intelligence comes in: the ability to understand the Voice of the Customer, as expressed within the intricate web of the social network, via tools and technology. What better tools for alerting and reporting on incidents in real time than those offered by the Business Intelligence suite of tools (in its most generalized state)?

I am so happy to also report that in 2011, BI technology is taking an even larger footprint into the Social Intelligence space. When I can say more, I will. Just know I am really excited about the future ahead of us folks!

Happy New Year readers.

Applying the Bing Decision-Engine Model to “Business Intelligence” and Other Musings

Yes, folks, I am back. Wait, didn’t I write that before?

Well, after having my 1st child, I spent many months (just shy of 10, to be exact) noodling business intelligence and the concepts I had previously discussed on my blog. For the last 5 years, I have been touting the need for better search integration, offering up the BI mashup concept before people really understood what a plain vanilla dashboard was, and being met by glazed stares and confusion. Now that folks are catching on to the iGoogle experience, and the ability to “mash up” or integrate several points of interest or relevance into a dashboard, I want to discuss this topic again. But this time, I want to apply the concept of the Decision Engine, instead of just the Search Engine, when it comes to ways to make BI content more meaningful, more relevant and more useful to end users.

Side note: “mashup” is still not a recognized word in the spell-check driven dictionary lists for the greater population of enterprise applications.

Coupled with my mashup passion was my belief in eye-tracking studies. Eye-tracking measures the human behavior of looking at something, recording the concentration of the eyes on a particular area of an object of interest, say a website. In the case of business intelligence, I applied eye-tracking studies to the efficacy of dashboard design in order to better understand the areas where the human brain focused concentration vs. those it ignored (despite what the person might say was of interest to them).

Advertisers have known about eye-tracking studies for years and have applied the results to their business. For example, the eyes will focus on the top-left corner first, whether on a TV screen, a book, a piece of paper or a dashboard. It is the area of the greatest concentration, so special importance has been paid to that piece of advertising real estate. And since the rise in popularity of folks like Stephen Few and Edward Tufte, whose principles for effective dashboard design have driven many a BI practitioner to rethink the look and feel of what they are designing, this concept of “top left is most important” has become commonplace.

And the handful of other book-grade principles have risen to the surface too: less is more when it comes to color, overuse of lines in graphs is distracting, save pie for dessert (pie charts, that is), etc. But tying it all together is another story altogether. Understanding how human perception, visual perception and dashboard design meet is a whole other can of worms, and usually requires a specialized skill set to fully “grok” (sorry, but I love Heinlein’s work). :)

Excuse my digression…


Take a look at this image, which shows eye-tracking results from the three most popular search engines in 2006:

 

Notice the dispersion of color measured in the Yahoo and MSN examples vs. Google. This correlates with the relevancy of the results and content presented on the page. And 4 years ago, Google’s search engine was a popular go-to tool for many when it came to finding related websites to help answer questions. Fast forward 4 years, and MSN is now Bing, and what was the search engine is now dubbed the “decision engine.”

In my eyes, the advent of the decision engine stems from the dilution of search-engine effectiveness caused by the flood of results presented to end users. In fact, I am sure the results of an eye-tracking study from 2010 would be vastly different, as a result of the exponential growth of web-based content available for crawling.

The same has occurred within enterprise business-intelligence platforms. What was introduced as powerful has really become inundated with content, in the form of reports, objects, dimensions, attributes, attribute elements, actual metrics, derived metrics and the list goes on and on.

Superficially, search was introduced as an add-on to the enterprise BI platforms. An add-on; really, an afterthought.

To the credit of the solutions on the market (grouped into a collective unit), people didn’t know what they didn’t know — or, better put, what they needed to know — when building the technology behind their solution offerings. And they needed to start somewhere. It was only after BI became more mass-adopted in corporate America, and the need for some level of reporting grew pervasive down to even the smallest Mom-and-Pop shop, that people began to realize that visualizing the data was one thing; finding the results of those visualizations or data extractions was an entirely different can of worms.

At the same time, the search giants started innovating and introducing the world to the concepts of real-time search and the “decision engine” named Bing. Understanding the statistical models behind how search algorithms work, even simplistically — enough to be dangerous — is something any reader of this blog and any BI practitioner would be smart to invest their own time in.

In a nutshell, my belief? Applying those principles, and the eons of dollars the search giants have thrown at optimizing said models, is an effective way for BI solutions at any level to leverage the work done to advance search research and technology, instead of just patching BI platforms with ineffective search add-ons. Just look back at the Golden Triangle study graphic above, and remember that long before BI design experts like Tufte and Few said it, advertising gurus knew that the top-left real estate of any space is the most important place to reach end users. So, instead of thinking of search as a nice add-on for your BI platforms, why not see it as a necessity? If a report is loaded into a repository and no one knows about it, was it ever really done? Let alone meaningful or valuable enough to be adopted by your end users? Think about it…
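To ground the idea, here is a minimal sketch of applying a classic search-relevance measure (TF-IDF) to BI report titles. The report names are made up, and a real platform would index far richer metadata than titles, but the principle — rank by relevance, don't just keyword-match — is the same:

```python
import math
from collections import Counter

# Sketch: rank BI report titles against a query with basic TF-IDF,
# instead of a bolted-on exact-keyword match. Report titles are made up.
REPORTS = [
    "daily sales trend by region",
    "inventory aging and out of stock summary",
    "sales pipeline forecast by account executive",
]

def tfidf_score(query, doc, corpus):
    doc_terms = Counter(doc.split())
    n_docs = len(corpus)
    score = 0.0
    for term in query.split():
        tf = doc_terms[term]  # term frequency in this report title
        df = sum(1 for d in corpus if term in d.split())
        if tf and df:
            # idf down-weights terms that appear in many reports
            score += tf * math.log(n_docs / df)
    return score

def search(query):
    ranked = sorted(REPORTS, key=lambda d: tfidf_score(query, d, REPORTS),
                    reverse=True)
    return ranked[0]
```

Even at this toy scale, "sales forecast" ranks the pipeline forecast above the generic sales trend report, because "forecast" is the rarer, more discriminating term.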

Gartner 2010 Business Intelligence Tools Magic Quadrant

For those of you who prefer not to register to receive this information, here is the 2010 Gartner Magic Quadrant rating the latest and greatest BI Platforms.

 

I love how many of the pure-play newbies of last season, like QlikTech, moved from the visionaries into the challengers quadrant, giving the big dogs on campus (Microsoft, SAP/Business Objects and Oracle) a run for their money. And while value can be shown easily using a product that can consume and spit out dashboards as easily as making scrambled eggs in the morning, one has to wonder how much value it provides over time when the data supporting those dashboards often still requires much manual intervention, i.e., acquisition from source systems, cleansing, transformation and loading into a consumable format. Where's the ROI in that? Most systems boast about the time savings achieved with implementation when calculating a BI system's ROI.

 

But, not to knock them: I find them a great alternative for proof-of-concept work, or when the manual nature of compiling the data isn't a concern (or is someone's role), and all that is needed is the icing to tell the cunning story (“Once upon a time, there was a SKU… And this SKU had many family members who lived in different houses in different regions of the world”).

 

Aah yes, if only all BI could be told as such a happy little anecdote of a story…A girl can wish can’t she?

Download Gartner 2010 Report Click Here

Report is also available in my SkyDrive library.

Anyone Else Notice the Overuse of the Word ‘Dashboard’ / Lack of Principled Designs on the Web Today?

Doing a keyword search tonight on a variation of ‘dashboard, casino, marketing’ yielded two pages of the same link and a mix of commentary-type blogs. Granted, I am a blogger and find much value in the opinions of others. But disenchantment quickly set in. It’s easy to comment on the designs of others. Crafting strategic KPIs in a way that can easily cascade down into tactical management dashboards and, ultimately, into operational reports, in order to appeal to the broad audience of C-levels, middle managers and individual contributors, is challenging. The data model and underlying ETL processes, the storage mechanisms, the network capacity, even the power of your box (physically and virtually speaking) can plague a dashboard’s performance. But what about design?
I am a huge fan of Stephen Few, having sat in several of his all-day classes as well as helping coordinate his appearance as keynote speaker at a local TDWI NW Chapter event. His principles are simple to understand and appeal visually to the broad and hidden nature of our visual cortex’s response to stimuli.
Our brains play a key part in the adoption of dashboards, plain and simple. The better designs in the world take into account everything from eye tracking on a screen, to the real-estate importance of item placement, to color response cues (red: stop; green: go) and, more importantly, our innate tendency to shut down when inundated with color / graphics / text / information. While everyone has stepped onto the bus and started down Dashboard Drive, I have to wonder when and where this road will stop, and what latest-and-greatest will move into its place. Beware, oh reader; don’t be swayed by the flash and glitz of the sales presentation. Instead, be wowed by sparklines and bullet graphs (here: http://www.perceptualedge.com/blog/?p=50) and their ability to relay trend, target and actuals in a simple line graph or horizontal bar chart using varying grey tones and a hint of red, without resorting to traffic lights or gauges (the banes of my existence, friends, along with the ill-fated, overused pie chart).
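For a flavor of how little ink a sparkline actually needs, here is a toy Python sketch that renders a trend as a single row of block characters; the weekly sales series is invented for illustration:

```python
# A toy text sparkline in the spirit of Few's compact trend displays.
BARS = "▁▂▃▄▅▆▇█"

def sparkline(values):
    """Map a numeric series onto eight block characters, min to max."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1  # avoid dividing by zero on a flat series
    return "".join(BARS[int((v - lo) / span * (len(BARS) - 1))] for v in values)

weekly_sales = [12, 14, 9, 16, 21, 18, 25]
print(sparkline(weekly_sales))  # whole trend visible at a glance, no axes, no gauges
```

Eight grey-scale levels and zero chartjunk: the mid-week dip and end-of-week climb read instantly, which is exactly the point.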

Balanced Scorecard Collaborative Annual Performance Management Summit 2010 – Presenter Experience

Sorry for my delay. Much of my energy these days is going into the productization of my consulting experience around building dashboards and scorecards that are linked through common, meaningful KPIs (key performance indicators) along vertical channels. It always surprises me which industries, from an outside point of view, would seemingly have cunning measurement tools in place, versus those that would not. Why? Because I am often wrong in my presumptions. I assume a profitable company has some extraneous $$ available to invest in BI or other data visualizations, because, naturally, that is where I would invest it if I were Company X’s CEO. :) RIGHT! That is not the typical use case. Even companies in highly regulated, compliance-driven fields still use spreadsheets or Access databases (or as Wayne E. calls them, spreadmarts) to house, compile and report their data. Tomorrow, May 12, I am speaking at the annual BSCol Performance Management Summit on designing closed-feedback-loop systems to address scorecard events, and I was reminded, dear readers, that we have come a long way since I started this blog many years ago, but have a long way to go before companies (and we ourselves, frankly) fully grasp the best ways to excite leaders about driving business initiatives and strategic objectives meaningfully instead of instinctually; where middle managers and line workers will have their contributions linked or cascaded down into those objectives; where reporting and dashboarding tools will offer simplicity and ease of use in developing the supporting data model to facilitate drill-downs, an often overlooked requirement when building scorecards or dashboards that "drill down" into reports at that fine layer of granularity; where the same platform can run complex statistical modeling techniques against the same datasets; and where data visualizations become more than the latest Flash-based dashboard that often carries more sex appeal during the demo than one can ever extract from it post-acquisition. Aah yes, those days will be nice. One day, I believe it will happen. A girl can dream, right…?

Data Visualization: Looking vs. Seeing

For many years, vision researchers have been investigating how humans use the visual cortex and other perception-based systems to analyze images. An important early result was the discovery of a very small subset of visual properties that can be detected very quickly, and for the most part very accurately, by the lowest of these systems, aptly referred to as the low-level visual system.

These properties were initially called “preattentive”, since their detection seemed to come before one actually focused their attention.

Since then, we have developed a better understanding. As it stands today, attention plays a critical role in what we see, even at this early stage of vision. The term preattentive continues to be used, however, since it conveys the speed and ease with which these properties are identified.

Anne Treisman identified two types of visual search task: one that is preattentive, known as feature search, and another that requires conscious attention, known as conjunction search. Feature search can be performed quickly and preattentively for targets defined by primitive features.

These features or properties can be broken into three areas: color, orientation and intensity.

And as you might have guessed, conjunction search is slower and requires the participant’s full attention, something we humans have a hard time giving in certain situations, a problem only worsening with the advent of handheld devices and other mobile smartphones to distract us.
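The distinction can be sketched in a few lines of Python: in a feature-search display a single feature (color) uniquely identifies the target, while in a conjunction display the distractors each share one feature with the target, so every item must be checked on both features. The displays below are contrived for illustration:

```python
# Toy illustration of Treisman's two search modes. Items are (color, shape)
# pairs and the target is a red circle.
def make_display(n, conjunction):
    if conjunction:
        # Conjunction display: distractors share one feature with the
        # target -- alternating red squares and blue circles.
        items = [("red", "square") if i % 2 else ("blue", "circle")
                 for i in range(n - 1)]
    else:
        # Feature-search display: every distractor is blue.
        items = [("blue", "circle") for _ in range(n - 1)]
    return items + [("red", "circle")]

def color_alone_finds_target(items):
    """Can a single feature (color) pick out the target, as in pop-out?"""
    reds = [it for it in items if it[0] == "red"]
    return reds == [("red", "circle")]

print(color_alone_finds_target(make_display(20, conjunction=False)))  # True
print(color_alone_finds_target(make_display(20, conjunction=True)))   # False
```

In the feature display, one pass over one dimension suffices (the pop-out); in the conjunction display it does not, which is the computational analogue of the slow, attention-demanding serial scan.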

“Typically, tasks that can be performed on large multi-element displays in less than 200 to 250 milliseconds (msec) are considered preattentive. Eye movements take at least 200 msec to initiate, and random locations of the elements in the display ensure that attention cannot be prefocused on any particular location, yet viewers report that these tasks can be completed with very little effort. This suggests that certain information in the display is processed in parallel by the low-level visual system.” (“Perception in Visualization by Christopher Healey”)

What does this mean? Well, given a target, say a red circle, and distractors being everything else, which in this case are blue objects, one can quickly see which is which, i.e., in under 200 msec you can glance at these two pictures and pick the target out from the distractors, right?

As in this example, it seems that introducing preattentive cognition to dashboards would result in a healthy and loving relationship, one which, carried over time (i.e., employed by BI practitioners during the design phase of any BI / data visualization project), would result in more meaningful, less cluttered dashboards, right?

Now, think about your dashboards and BI visualizations. Think about how many of them tell a good, clean story, where the absolute most important information “pops” out to the end viewer, requiring little explanatory text, contextual help or the other mechanisms we BI practitioners employ to explain our poorly designed dashboards. I am by no means claiming everything I have designed to be fault-free; we all learn as we go. But I can say that my designs of today are better than yesterday’s because of my understanding of visual perception, neural processing / substrates and the cognitive sciences, and of how to apply these fields to business intelligence to drive better data visualizations.

Why is it that some who work in BI think the more gauges or widgets pushed into a screen, the better?

Instead, I contend that applying this principle to dashboard design, report design, website design or any type of design would reveal that much in our world today is poorly designed: fitted with non-complementary colors and an overuse of distractor objects, rendering the user confused or “distracted” from the target object, which could be something as important as a company’s revenue, or the number of deaths in a hospital’s ER wing, both so important that one might question how such numbers could get lost.

 

Try it for yourself by reading anything by Stephen Few or Edward Tufte as a starting place.

SPModule and SharePoint 2010: the Power of the PowerShell

 

Check out my Google sidewiki post here: http://www.google.com/sidewiki/entry/flowergirldujour/id/Swp7WYEPjDdbMeaYQkPn8Hir4MY

And definitely download this module so that you can use SPModule…

SPModule.zip

Here is what the interface looks like:

From the blog of Zach Rosenfield, the SharePoint Program Manager at Microsoft:

SPModule.HelloWorld()

Welcome to the introduction of SPModule.  SPModule is a Windows PowerShell module written by members of the SharePoint Product Group in our spare time.   SPModule is an example of how we would envision you accomplish various common tasks within Windows PowerShell in a SharePoint 2010 environment.  We hope to position various best practices from these scripts and we hope in the long term to reference these also within technet.  These blog posts serve simply as our first location of sharing them, and this post will be updated once we have the samples hosted within technet.  The scripts themselves are not officially supported, but we will entertain questions and suggestions through this blog until we get it onto technet. 

How do I get started?

First download the zip that contains the scripts from here:

http://sharepoint.microsoft.com/blogs/zach/Script%20Library/Modules/SPModule/SPModule.zip

Next, unpack the zip file onto a share in your environment.  Before you get to use the scripts, you’ll need to make a decision around signing.  By default, Windows PowerShell is configured securely such that it will not run unsigned scripts.  You can choose to either sign them yourself with a self-signed certificate or run Windows PowerShell in a mode where you do not verify signatures.  We do not recommend running Windows PowerShell in this state.  However if you are in an isolated environment, you may choose to do so.  If you follow my last post about signing files, you can use those instructions to sign the entire “Module” in a single command:

function Add-Signing($file){ Set-AuthenticodeSignature $file @(Get-ChildItem cert:\CurrentUser\My -codesigning)[0] }

ls -r -Include ("*.ps1","*.psd1","*.psm1") |%{ Add-Signing $_ }

Please note that if you have not installed SharePoint, you need to lower the Execution Policy to “AllSigned” using this command: Set-ExecutionPolicy AllSigned. The SharePoint installer does this for you, so if you’ve already installed SharePoint this step is not needed.

Then open Windows PowerShell as an administrator (right click on the link and select “Run as administrator”).  If you already have SharePoint 2010 installed, you could use the SharePoint 2010 Management Shell instead.  Once the window opens, the first thing we need to do is add the path to the module to your Windows PowerShell module path (presuming you created a folder called “SPModule” on your server):

$env:PSModulePath = “C:\SPModule;” + $env:PSModulePath

Next we need to import the modules:

Import-Module SPModule.misc

Import-Module SPModule.setup

When you import the SPModule.misc module, you will invoke an update check.  In 1.0, this will check a file in the script library above to see if there is a newer version available.  If you are notified that there is, you can go to that location and download the newer version.  Once the Import-Module commands are done, you’re ready to use SPModule.

So, what does SPModule give me?

The 1.0 version of SPModule provides a few major new commands and a number of smaller supporting commands.  Here’s how you can get the list of commands in the module:

Get-Command –Module SPModule.*

The major commands of 1.0 are Install-SharePoint, New-SharePointFarm, Join-SharePointFarm, and Backup-Logs.  They do exactly what their names would lead you to expect (Backup-Logs collects all the logs on the local machine, not the whole farm). The rest are for more advanced scenarios or are used by these larger functions—please be careful using commands you don’t understand.  Here are some quick examples to get you started:

Install SharePoint Bits (including Prereqs) on a server:

Install-SharePoint -SetupExePath “\\servername\SharePoint2010-Beta\setup.exe” -PIDKey “PKXTJ-DCM9D-6MM3V-G86P8-MJ8CY”

New-SharePointFarm –DatabaseAccessAccount (Get-Credential DOMAIN\username) –DatabaseServer “SQL01” –FarmName “TestFarm”

Join-SharePointFarm -DatabaseServer “SQL01” -ConfigurationDatabaseName “TestFarm_SharePoint_Configuration_Database”

Backup-Logs -outp “$env:userprofile\Desktop\SharePointLogs.zip”

Note:  Backup-Logs may have trouble putting the subzip files into the final zip.  We are aware of this issue and are working on this for the next release.  For now, we will detect the situation and keep the subzip files that had a problem”

Talking about Bayesian inference, Data Visualization and More

 

Quote

Bayesian inference – Wikipedia, the free encyclopedia

What is your take on using Bayesian inference to determine the website behaviors of consumers of your goods and services? And once established, with quantifiable metrics gleaned from actual behaviors, do you go back and adjust your hypothesis with the more objective data? Most do not, so don’t fault yourself. If you have even tried to apply statistical methods to “gut feel” hypotheses, hats off to you!
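As a concrete (and deliberately simple) illustration of that adjust-your-hypothesis loop, here is a Python sketch of a conjugate Beta-Binomial update for a website conversion rate; the prior and the visitor counts are invented for the example:

```python
# A minimal Bayesian update for a website-behavior question: "what is the
# conversion rate on the new landing page?"
def beta_update(a, b, conversions, visitors):
    """Beta(a, b) prior + binomial data -> Beta posterior (conjugate pair)."""
    return a + conversions, b + (visitors - conversions)

def beta_mean(a, b):
    return a / (a + b)

# "Gut feel" prior: roughly a 5% conversion rate, loosely held.
a, b = 1, 19
print(round(beta_mean(a, b), 4))   # prior belief before any data

# Observe 30 conversions in 400 visits, then revise the estimate.
a, b = beta_update(a, b, conversions=30, visitors=400)
print(round(beta_mean(a, b), 4))   # posterior mean, pulled toward the data
```

The gut-feel 5% gets pulled up toward the observed 7.5% as evidence accumulates, and you can keep feeding each week's numbers back through the same update, which is precisely the go-back-and-adjust discipline most shops skip.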

The power within this model can also be extended to visualizing your statistical models as well.

Check out what Twitter users were ‘tweeting’ about during the Super Bowl as visualized by the NY Times:

Twitter Chatter during SuperBowl, NY Times

 

Laura Edell Google SideWiki

Talking about Getting Started \ Processing 1.0

 

Quote

Getting Started \ Processing 1.0

Gotta ask my audience for commentary on this one…How many of you are using Processing 1.0 environment/language to build your complex data visualizations?

Processing.org describes it as follows: "Processing is a simple programming environment that was created to make it easier to develop visually oriented applications with an emphasis on animation and providing users with instant feedback through interaction." (http://processing.org/learning/gettingstarted/)

I have been using this app since college, and as a BI professional services consultant/developer now, I tend to overlook the simplicity and ease of use of the Processing language, its functions and its environment (the PDE).

Has anyone else used it for building data visualizations?

Does Google SideWiki rock? Let’s find out…

 

I posted a wiki to my company’s home page (http://www.google.com/sidewiki/entry/100318308419716752076/id/bjzCPgiw49UzmckRuIiLBRtmh4Y ) as a test of Google SideWiki. An avid fan of the Google app suite (though still an Apple girl at heart), I thought I would try SideWiki today. Let’s see how well it links my social networks together from an SEO perspective. As we all know in the "social intelligence" space, linking up social profiles is the "FREE" way to get higher search-engine traction between our profiles and our social network pages/spaces.

Welcome to Mantis Technology Group

Respect Paid to Pitney Bowes – How PBBI Turned This Blogger’s Opinion Around

I posted the Gartner Magic Quadrant last week (Gartner Magic Quadrant for Data Integration – Delta Comparing 2007-2009) and commented with my opinion on the choices for winners and losers (big surprise, but really, I was shocked).

For one, BI practitioners tend to believe they are experts at their domain, and rightfully so, if they are good at what they do and have been doing it for a few years. In my case, 11 years of my life have been spent learning, upgrading, relearning and immersing myself in business intelligence tools and platforms.

So, this year, I was surprised by their ETL quadrant because of Pitney Bowes – Here is my comment:

[Laura Edell comment] Ummm, I thought Pitney Bowes provided corporations with stamps and other business-related supplies…How does one leap from that genre to not just business intelligence, but data integration…? Maybe to compete with the former Business Objects Data Quality Zip Code Cleanser? j/k – but I thought that was eye catching enough to call out.

A little while later, I was most surprised to receive an email from Pitney Bowes’ VP of Communication following up on my comment in a most professional manner. He also offered to chat further about their PBBI solution and walk me through the history and evolution of the PBBI product stack:

Here is a list of ETL / BI related links from the Pitney Bowes site @ http://pbinsight.com/solutions/by-business-need 

Improve Operational Efficiency

Automated Address Management for Improving Operational Efficiency

Kudos, Pitney Bowes! I stand corrected!

Troubleshooting Apple Permissions Issue: MacBook Pro OS X 10.5.8 Grey Screen Spawned This Posting (again)

 Thought this support article from Apple’s KnowledgeBase website was really helpful and wanted to share…

Troubleshooting permissions issues in Mac OS X

Content from article:

Summary

Learn about the concept of permissions (or "privileges") in Mac OS X, issues that can arise due to incorrect permissions settings, and how to troubleshoot them.

Products Affected

Mac OS installation/setup (any version)

Using the Repair Privileges Utility

Most users of Mac OS X have not intentionally modified privileges and simply need a utility to reset system privileges to their correct default values. If you have Mac OS X 10.2 and later, this utility is included in the operating system. If you have Mac OS X 10.1 you can download it. For versions 10.0 to 10.1.4, you must update to version 10.1.5 first.

For Mac OS X 10.2 or later, open Disk Utility (/Applications/Utilities/). Select your Mac OS X startup volume in the column on the left of the Disk Utility window, then click the First Aid tab. Click the Repair Disk Permissions button. You may see an erroneous message.

If you have modified the contents of the folder /Library/Receipts, the Repair Permissions feature won’t work as expected. Repairing permissions requires receipts for Apple-installed software. Additionally, the utilities only repair Apple-installed software and folders (which does not include users’ home folders).

The remainder of this document contains more advanced information.

Note: In Mac OS X 10.5 and later, while started up ("booted") from the Mac OS X 10.5 installation disc, a user’s home directory permissions can be reset using the Reset Password utility.

Warning: This document describes how you may modify permission settings by entering commands in the Terminal application. Users unfamiliar with Terminal and UNIX-style environments should proceed with caution. The entry of incorrect commands may result in data loss and/or unusable system software. Improper alteration of permissions can result in reduced system security and/or exposure of private data.

Permissions Defined

Mac OS X incorporates a subsystem based on a UNIX-style operating system that uses permissions in the file system. Every file and folder on your hard disk has an associated set of permissions that determines who can read, write to, or execute it. Using the AppleWorks application and one of its documents as an example, this is what the permissions mean:

  • Read (r–)
    You can open an AppleWorks document if you have the read permission for it.
  • Write (-w-)
    You can save changes to an AppleWorks document if you have the write permission for it.
  • Execute (–x)
    You can open the AppleWorks application if you have the execute permission for it.

    Also note that you must have execute permission for any folder that you can open; thus File Sharing requires execute permission set for other, world, and everyone for the ~/Public folder, while Web Sharing requires the same setting for the ~/Sites folder.

When you can do all three, you have "rwx" permission. Permissions for a folder behave similarly. With read-only permission to a folder containing documents, you can open and read documents but not save changes or add new documents to the folder. Read-only (r–) permission is common for sharing files with guest access, for example.

Owner, Group, Others

Abbreviations like "rwx" and "r-x" describe the permission for one user or entity. The permissions set for each file or folder defines access for three entities: owner, group, and others.

  • Owner – The owner is most often the user who created the file. Almost all files and folders in your home directory will have your username listed as the owner.
  • Group – Admin users are members of the groups called "staff" and "admin". The super user "root" is a member of these and several other groups. Non-admin users are members of "staff" only. Typically, all files and folders are assigned to either "staff," "admin," or "wheel".
  • Others – Others refers to all other users that are not the owner or part of the group for a file or folder.

Since each entity has its own permission, an example of a complete permission set could look like "-rwxrw-r–". The leading hyphen designates that the item is a file and not a folder. Folder privileges appear with leading "d," such as "drwxrw-r–". The "d" stands for directory, which is what a folder represents. Figure 2, below, depicts how this looks in the Terminal application.

Abbreviating permissions as numerals

After a while, you might think that "-rwxrwxr-x" is a lot to type. And you’d be right. That’s why there’s a simple way to abbreviate permissions as numerals, ranging from 777 (-rwxrwxrwx) down to 000 (no access). An "rwx" becomes a 7, the sum of 1, 2, and 4, where 4=Read, 2=Write, and 1=Execute. A zero means no access. Each of the three numerals is the sum of permissions for Owner, Group, and Other, respectively. Thus our example of "-rwxrwxr-x" becomes 775.
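That 4/2/1 rule is mechanical enough to sketch in a few lines of Python (the function name is mine, not an OS utility):

```python
# Convert a symbolic permission string such as "rwxrwxr-x" into the octal
# numeral described above (r=4, w=2, x=1, summed per owner/group/other triad).
def to_octal(symbolic):
    weights = {"r": 4, "w": 2, "x": 1, "-": 0}
    return "".join(
        str(sum(weights[c] for c in symbolic[i:i + 3]))  # one digit per triad
        for i in range(0, 9, 3)
    )

print(to_octal("rwxrwxr-x"))  # 775, matching the article's example
print(to_octal("rw-r--r--"))  # 644, a common default for files
```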

Example: Creating a TextEdit document

Suppose you create a TextEdit document and save it in the Documents folder of your home directory. The document has privileges of "-rw-r–r–", so you can read and write to the file; but the assigned group and any other users can only read it. Because you saved the file in your Documents folder (drwx——), the group and other users cannot even see your file. The enclosing folder’s permissions effectively supersede the file’s own permissions. This is how the home directory structure of Mac OS X provides privacy. If you drag the file to your Public folder (drwxr-xr-x) and log out, another user could log in to the computer and read your public file.

Default settings for new files and folders

Ownership settings

  • User is the user that creates the new file or folder.
  • Group is default group of the user who created the file or folder.

Permissions

  • Folders or directories: drwxr-xr-x
  • Files: -rw-r--r--

Root: The "Super User"

In Mac OS X, a super user named "root" is created at time of system installation. The root user has complete access to all files and folders on the computer, as well as additional administrative access that a normal user does not have. In normal day-to-day usage of your computer, you do not need to log in as the root user. In fact, the root user is disabled by default.

Issues Related to Permissions

Incorrect permission settings may cause unexpected behavior. Here are several examples with troubleshooting suggestions:

  • Application installers, Applications folder
    A third-party application installer incorrectly sets permissions on the files it installs, or even the entire Applications folder. Symptoms of the Application folder’s permissions being set incorrectly include applications appearing in the dock as question marks, and/or not being able to connect to the Internet. It is also possible that software installed while logged in as one user will be inaccessible when logged in as another. To avoid this, make sure you are logged in with your normal user account when installing software that you wish to use with that account.
     
  • Files created in Mac OS 9
    Files created in Mac OS 9 may appear in Mac OS X with root ownership. When you start up in Mac OS 9 on a computer that also has Mac OS X installed, you can see, move, and delete all files, giving you the equivalent of root access. For this reason it’s a good idea not to move or open unfamiliar files or folders when started up in Mac OS 9.
     
  • Power interruption
    The file system may be affected by a power interruption (improper shutdown) or when it stops responding (a "hang" or "freeze"). This could affect permissions. You may need to use fsck.
     
  • Software access=user access
    Most applications executed by a user only have access to the files that the user has access to. Backup software, for example, may not back up Mac OS X system files that have root ownership.
  • Emptying the Trash
    In some circumstances, folders for which you do not have write permission can end up in the Trash; and you will not be able to delete them or the files contained in them. Remember that in Mac OS X there is not a single Trash folder. Instead, each user has a Trash folder in their home directory (named ".Trash"). There is also a Trash folder for the startup volume, and Trash folders for other volumes or disks. When a user throws away a file on a local non-startup volume, the name of the folder on that volume is "/.Trashes/UID", where UID is the user ID number of the user (which may be seen in NetInfo Manager). In either case, all Trash folders are hidden from the user in the Finder. In these situations you can either start up into Mac OS 9 to locate the files and delete them, or you can use the Terminal application. Issues with emptying the Trash are much less likely to occur in Mac OS X 10.2 or later, since the Finder empties the Trash as the root user. However, issues may still occur with files on remote volumes for which your local root user has no special privileges.

Warning: Typographical error or misuse of the "rm -rf" command can result in data loss. Insertion of a space in the wrong place could result in the complete deletion of data on your hard disk, for example. You may wish to copy and paste the commands below into a text editor to verify spacing. Follow these steps to delete Trash for the logged-in user:

  1. Open the Terminal application.
  2. Type: sudo rm -rf
    Note: Type a space after "-rf". The command does not work without the space. Do not press Return until Step 6.
  3. Open your Trash.
  4. Choose Select All from the Edit menu.
  5. Drag all of your Trash into the Terminal window. This causes the Terminal window to automatically fill in the name and location of each item in your Trash.
  6. Press Return.

All of the items in your Trash are deleted. As an alternative method, you may execute these commands. The second and third commands will delete Trash belonging to other users. The commands are:

Warning: Typographical error or misuse of the "rm -rf" command can result in data loss. Insertion of a space in the wrong place could result in the complete deletion of data on your hard disk, for example. You may wish to copy and paste the commands below into a text editor to verify spacing.

Important: There is no space between "/" and ".Trash" or ".Trashes" below.

sudo rm -rf ~/.Trash/
sudo rm -rf /.Trashes/
sudo rm -rf /Volumes/<volumename>/.Trashes/

Note: To end the sudo session, you should either execute the exit command, or log out of Mac OS X and then log back in.

Respectively, this permanently deletes all files in the current user’s Trash, the startup volume Trash, and the Trash for other volumes (if any). These commands cannot delete locked files. You have to unlock them first.

Note: The sudo command can be used to temporarily obtain super user status and change permissions on files that otherwise could not be changed. However, it is only available if you are logged in with an administrator account, and it requires an administrator account user password for authentication.

How to View and Change Permissions in the Finder’s Info Window

The Mac OS X Finder can be used to inspect and modify permissions settings for some files and folders. You can only change permissions for files and folders of which you are the owner. This can aid in troubleshooting permissions-related issues. To view and change permissions in the Info window, follow these steps:

  1. Select a file or folder in the Finder.
  2. From the File menu, choose Show Info.
  3. Choose Privileges from the pop-up menu in the Info window.
  4. Using the pop-up menus, change permissions settings as necessary (Figure 1).
  5. Optional: If you are changing permissions for a folder and you want the changes to apply to enclosed folders as well, click Apply. Apply only appears when you show info for folders.

Note: Changes made using the Info window take effect as soon as they are made, even before closing the window.


Figure 1 Privileges in the Info window

Viewing and Changing Permissions With Terminal

The Terminal application is located in the Utilities folder in the Applications folder. You can use Terminal to inspect or change permissions. Unlike the Finder’s Info window, the sudo command gives you the convenience of root access without having to log out and back in as root.

Warning: Basic knowledge of the command line is required to utilize this tool. Data loss and/or unusable system software may result if the Terminal application is used improperly.

To determine the permissions settings for files or folders, open Terminal and navigate to the directory where the file or folder is located. Then execute the command "ls -l". The output resembles that in Figure 2.


Figure 2 Viewing permissions with Terminal

In the Figure 2 example, any user can read "File Name1.ext", because the read bit (r) is set for others. But the file is only changeable by root because the write bit (w) is only enabled for the owner, which is root. If the file is not a system file and you would like to be able to modify it from your normal account, you could change the owner with the following command:

sudo chown yourusername "File Name1.ext"

The file is owned by root, not by the user logged in, so the "sudo" command gives you temporary root access. Replace yourusername with your account’s short name.

Space syntax: Be careful when typing spaces in file paths within the Terminal. In the example, the filename is enclosed in quotation marks because it contains a space. Alternatively, you can replace spaces with a backslash followed by a space. Without the quotation marks, the same command would be typed as:

sudo chown yourusername File\ Name1.ext
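Either spelling names the same file. A quick sanity check (again in a scratch directory, with an invented filename) confirms it:

```shell
# Both quoting styles resolve to the identical file on disk.
cd "$(mktemp -d)"
touch "File Name1.ext"
ls "File Name1.ext"     # quoted form
ls File\ Name1.ext      # backslash-escaped form: lists the same file
```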

For more information on changing ownership, groups, and permissions, see the man (manual) pages for chown, chgrp, and chmod. You access man pages by executing "man <command_name>". For example:

man chmod

By default, man pages are displayed one at a time. To read the next page, press the Space bar. To exit the man page, press Q.

Gartner Magic Quadrant for Data Integration – Delta Comparing 2007-2009

We finally have an open source tool in the Gartner Magic Quadrant for Data Integration (source: Gartner Group), while IBM and Informatica keep a big lead.

Gartner has been tightening the inclusion criteria across most of its Magic Quadrants. For data integration, vendors must now meet the following:

  • They must generate at least $20 million of annual software license revenue from data integration tools or maintain at least 300 maintenance-paying customers for their data integration tools.
  • They must support data integration tools customers in at least two of the major geographic regions (North America, Latin America, Europe and Asia/Pacific).
  • They must have customer implementations that reflect the use of the tools at an enterprise (cross-departmental and multiproject) level.

Not many software vendors out there are charging as hard at data integration as IBM and Informatica, and it shows in the latest quadrant, where the two have pulled further in front of the competition.

The diagram on the left shows the moves from 2007 to 2008 and 2009, with the light green line being 2007 to 2008 and the dark green being 2008 to 2009. You can see that IBM made a big move in the 2008 quadrant and has hovered since, while Informatica has made big moves in each year.

The Losers

There are no real winners and losers, there are just those companies that will give the quadrant away for free, those who will just mention it in a press release and those who will pretend it doesn’t exist.  Among those who will pretend it doesn’t exist:

- Sun Microsystems and TIBCO have been dumped from the quadrant because they no longer sell ETL-style data integration tools. There are a couple of purple lines on the diagram from where those vendors used to be.

- ETI and Open Text are in free fall and won't be staying in the quadrant much longer. ETI is an old-school data integration vendor that never got the hang of the modern trend toward visual design interfaces and data integration suites.

- SAP BusinessObjects is still in the Leaders quadrant – but only just. That does not bode well for SAP's acquisition of Business Objects.

The Winners

- Obviously Informatica and IBM are the clear winners with Informatica making a big move based on recent acquisitions and releases such as Informatica 9 and the Business Glossary.

- Oracle jumps into the Leaders square and can blow raspberries back at Microsoft.

- Talend is the first open source data integration vendor to get into the quadrant, though it would argue (at great length) that it should have been included last year and that the quadrants move too slowly for the fast-paced world of business software.

- Syncsort and Pervasive keep plugging away, improving steadily and keeping their software costs beneath those of the market leaders.

[Laura Edell comment] Ummm, I thought Pitney Bowes provided corporations with stamps and other business-related supplies…How does one leap from that genre to not just business intelligence, but data integration…? Maybe to compete with the former Business Objects Data Quality Zip Code Cleanser? j/k – but I thought that was eye catching enough to call out.

Talking about Apple – Grey Screen of Perpetual Death – Macbook Pro won’t start up

Very interesting fact I wanted to share – while my blog has centered on Windows-based applications, usually in the BI space, I converted to Mac several years ago personally, and now there is no going back. However, the recent security and firmware updates to Mac OS X (10.5.8 for me) have caused much ado (headaches, to say the least). For the non-Unix folks out there, I thought I would post the workaround for that pesky grey bootup screen with the Apple logo we usually love and the spinning wheel signifying the system's attempt to boot. This never-ending cycle is infuriating, to say the least. You will need to run Unix-based commands natively in order to fix it.

The root cause addressed by this workaround has to do with changed permissions on the Macintosh HD. I thought changing the <Everyone> group was a smart thing… little did I know then!

Well, it looks like you're fixing to get some more command-line experience.

If you knowingly (or unknowingly) changed permissions to "everyone – no access," here is the process to fix it:

Restart the computer.

When you hear the chime, hold down the Command and S keys BEFORE the Apple logo appears. This boots the system into single-user mode.

Enter the following commands, pressing Return after each:

/sbin/mount -uw /
/bin/chmod -R o=r,+X /
/usr/sbin/chown root:admin /
/bin/chmod 1775 /
/bin/chmod -N /
reboot
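For reference, here is what the chown/chmod steps above aim to produce: a volume root owned by root:admin with mode 1775. This sketch demonstrates it on a scratch directory rather than /, so it is safe to run anywhere:

```shell
# Apply the target mode to a throwaway directory and inspect the result.
d=$(mktemp -d)
chmod 1775 "$d"       # rwx owner, rwx group, r-x others, plus the sticky bit
ls -ld "$d"           # mode column reads drwxrwxr-t (the trailing t is the sticky bit)
```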

The strange thing: each command may be interrupted by a flood of "operation not permitted" messages, line after line, and then at the very end a "bad file descriptor" error. I also tried /sbin/fsck -fy and one more command that appeared on the screen, then exit. It scrolled through several things, then rebooted, and FINALLY my login screen appeared! Be patient.

 Email me at laura.gibbons@me.com for more assistance.

TDWI NW Chapter – Next Chapter Meeting “Where BI Meets the Business: Driving business context into BI solutions”

Registration is limited and FREE! Register here today: http://tdwichapters.org/pages/seattle/event-registration.aspx

Check us out on TDWI at http://www.tdwi.org/northwest

When:

Wednesday, January 27th, 5:30 – 8:30pm

Where:

Microsoft
Lincoln Square, Floor 15
700 Bellevue Way NE
Bellevue, WA 98004

PARKING AVAILABLE IN BUILDING. Free exit after 8pm.

 

Sideline Comparative Predictions: Gartner’s 2010 Technology Trends

As promised in my last blog post, here is the comparison list of predictions for the Top 10 Strategic Technologies for 2010. I've highlighted the 2nd trend on Gartner's list, because yours truly has beaten that drum for years now, sometimes shot down and sometimes embraced for the belief that advanced analytics are where the true value of BI lies, and that those who adopt now will beat the curve and reap the rewards long overdue to companies who have invested millions into BI programs without realizing much gain, no matter which service implementer was used. ({Shameless plug}: had the reader used Mantis Technology Group (my company), this would be moot, as you would be reveling in the realized value we bring; yours truly is an employee and implementer of these very BI systems.) When it comes to the broad realm of BI or the facets within it, like social intelligence (another prediction), advanced analytics, or cloud computing (yet another prediction), Mantis excels at infusing value into even the smallest-scale implementation. Having gone from being the client to being the service provider, I have worked with the very largest firms and those that claim to be the best, down to niche providers like ourselves on a slightly bigger scale. I say with all earnestness that Mantis' offering truly stands above those in both spaces that I had previously hired, where I was often left with that disappointing feeling one gets upon realizing they did not get what they expected, and where confronting those who provided the end result usually leads down the "let's get the SOW and look at what you asked for" route, which never ends well. Clients, such as myself in my former life, often don't know what they don't know, especially when implementing technologies in which they are not well versed. As I have belabored before and will again quickly now: it is up to the service provider to hang up its $$ hat and help the client understand enough to be dangerous and make educated choices, not just those that will return the greatest financial gains, but those that will truly help deliver on the value proposition that IS POSSIBLE from well-implemented BI programs.

As said before, please share your predictions, comments or anecdotes with our readership. I (we) would love to hear your opinion too!

The top 10 strategic technologies as predicted by Gartner for 2010 include:

Cloud Computing. Cloud computing is a style of computing that characterizes a model in which providers deliver a variety of IT-enabled capabilities to consumers. Cloud-based services can be exploited in a variety of ways to develop an application or a solution. Using cloud resources does not eliminate the costs of IT solutions, but does re-arrange some and reduce others. In addition, enterprises that consume cloud services will increasingly act as cloud providers themselves, delivering application, information or business process services to customers and business partners.

Advanced Analytics. Optimization and simulation is using analytical tools and models to maximize business process and decision effectiveness by examining alternative outcomes and scenarios, before, during and after process implementation and execution. This can be viewed as a third step in supporting operational business decisions. Fixed rules and prepared policies gave way to more informed decisions powered by the right information delivered at the right time, whether through customer relationship management (CRM) or enterprise resource planning (ERP) or other applications. The new step is to provide simulation, prediction, optimization and other analytics, not simply information, to empower even more decision flexibility at the time and place of every business process action. The new step looks into the future, predicting what can or will happen.

Client Computing. Virtualization is bringing new ways of packaging client computing applications and capabilities. As a result, the choice of a particular PC hardware platform, and eventually the OS platform, becomes less critical. Enterprises should proactively build a five to eight year strategic client computing roadmap outlining an approach to device standards, ownership and support; operating system and application selection, deployment and update; and management and security plans to manage diversity.

IT for Green. IT can enable many green initiatives. The use of IT, particularly among the white collar staff, can greatly enhance an enterprise’s green credentials. Common green initiatives include the use of e-documents, reducing travel and teleworking. IT can also provide the analytic tools that others in the enterprise may use to reduce energy consumption in the transportation of goods or other carbon management activities.

Reshaping the Data Center. In the past, design principles for data centers were simple: Figure out what you have, estimate growth for 15 to 20 years, then build to suit. Newly-built data centers often opened with huge areas of white floor space, fully powered and backed by a uninterruptible power supply (UPS), water-and air-cooled and mostly empty. However, costs are actually lower if enterprises adopt a pod-based approach to data center construction and expansion. If 9,000 square feet is expected to be needed during the life of a data center, then design the site to support it, but only build what’s needed for five to seven years. Cutting operating expenses, which are a nontrivial part of the overall IT spend for most clients, frees up money to apply to other projects or investments either in IT or in the business itself.

Social Computing. Workers do not want two distinct environments to support their work – one for their own work products (whether personal or group) and another for accessing “external” information. Enterprises must focus both on use of social software and social media in the enterprise and participation and integration with externally facing enterprise-sponsored and public communities. Do not ignore the role of the social profile to bring communities together.

Security – Activity Monitoring. Traditionally, security has focused on putting up a perimeter fence to keep others out, but it has evolved to monitoring activities and identifying patterns that would have been missed before. Information security professionals face the challenge of detecting malicious activity in a constant stream of discrete events that are usually associated with an authorized user and are generated from multiple network, system and application sources. At the same time, security departments are facing increasing demands for ever-greater log analysis and reporting to support audit requirements. A variety of complementary (and sometimes overlapping) monitoring and analysis tools help enterprises better detect and investigate suspicious activity – often with real-time alerting or transaction intervention. By understanding the strengths and weaknesses of these tools, enterprises can better understand how to use them to defend the enterprise and meet audit requirements.

Flash Memory. Flash memory is not new, but it is moving up to a new tier in the storage echelon. Flash memory is a semiconductor memory device, familiar from its use in USB memory sticks and digital camera cards. It is much faster than rotating disk, but considerably more expensive; however, this differential is shrinking. At the current rate of price declines, the technology will enjoy more than a 100 percent compound annual growth rate over the next few years and become strategic in many IT areas including consumer devices, entertainment equipment and other embedded IT systems. In addition, it offers a new layer of the storage hierarchy in servers and client computers that has key advantages including space, heat, performance and ruggedness.

Virtualization for Availability. Virtualization has been on the list of top strategic technologies in previous years. It is on the list this year because Gartner emphasizes new elements such as live migration for availability that have longer term implications. Live migration is the movement of a running virtual machine (VM), while its operating system and other software continue to execute as if they remained on the original physical server. This takes place by replicating the state of physical memory between the source and destination VMs; then, at some instant in time, one instruction finishes execution on the source machine and the next instruction begins on the destination machine.

However, if replication of memory continues indefinitely while execution of instructions remains on the source VM, and the source VM then fails, the next instruction would take place on the destination machine. If the destination VM were to fail, just pick a new destination and restart the indefinite migration, thus making very high availability possible.

The key value proposition is to displace a variety of separate mechanisms with a single “dial” that can be set to any level of availability from baseline to fault tolerance, all using a common mechanism and permitting the settings to be changed rapidly as needed. Expensive high-reliability hardware, with fail-over cluster software and perhaps even fault-tolerant hardware could be dispensed with, but still meet availability needs. This is key to cutting costs, lowering complexity, as well as increasing agility as needs shift.

Mobile Applications. By year-end 2010, 1.2 billion people will carry handsets capable of rich, mobile commerce providing a rich environment for the convergence of mobility and the Web. There are already many thousands of applications for platforms such as the Apple iPhone, in spite of the limited market and need for unique coding. It may take a newer version that is designed to flexibly operate on both full PC and miniature systems, but if the operating system interface and processor architecture were identical, that enabling factor would create a huge turn upwards in mobile application availability.

“This list should be used as a starting point and companies should adjust their list based on their industry, unique business needs and technology adoption mode,” said Carl Claunch, vice president and distinguished analyst at Gartner. “When determining what may be right for each company, the decision may not have anything to do with a particular technology. In other cases, it will be to continue investing in the technology at the current rate. In still other cases, the decision may be to test/pilot or more aggressively adopt/deploy the technology.”

Article copied from: http://www.gartner.com/it/page.jsp?id=1210613

Sideline Topic: Looking for Feedback on What YOU THINK about CMS' 2010 Technology Predictions

Can it be true that in 2010 the market focus within the technology sector will finally shift to customer-facing systems and internal applications delivering more meaningful content applicability?

Looking back at the content of my own blog, my readers and I (thank you, lovely readers!) have been feeling the need for business intelligence to step back into customer intelligence once again, a place we (BI practitioners) have been before. And we, the go-forward-and-capture-the-world Gen X, Y and Z'ers, have shifted what we need in terms of content delivery. These are the generations of the "serve it up TO US in a Google-style fashion, otherwise I am too busy to look for the information on your website" crowd, where texting is the preferred vehicle for communication and anything that requires more than two hops to reach the information we need is one step too many. Sad, but true; those who recognize this fact of life now will adjust and survive when this generation, now in college, graduates and enters our workplace. And so I bring you CMS' predictions, followed by the tried and true Gartner predictions for comparison's sake. Please let me know what you think, what your own predictions are, or any other comments you want to share! J'accueille un nouvel an (I welcome a new year) – how about you? :)

Article copied from : http://www.information-management.com/news/ecm_serach_cloud_sharpoint_mobile_document_management-10016801-1.html?msite=cloudcomputing

"The current recessionary period in particular will yield many content technology investments focused on customer-facing systems," CMS Watch founder Tony Byrne was quoted as saying. "In 2010 we will see a renewed focus on internal applications."

  1. Enterprise content management and document management will go their separate ways.
  2. Faceted search will pervade enterprise applications.
  3. Digital asset management vendors will focus on SharePoint integration over geographic expansion.
  4. Mobile will come of age for document management and enterprise search.
  5. Web content management vendors will give more love to intranets.
  6. Enterprises will lead thick client backlash.
  7. Cloud alternatives will become pervasive.
  8. Document services will become an integrated part of enterprise content management.
  9. Gadgets and Widgets will sweep the portal world.
  10. Records managers face renewed resistance.
  11. Internal and external social and collaboration technologies will diverge.
  12. Multilingual requirements will rise to the fore.

BOTHER – Adding Value through Effective Relativization and Rationalization of KPIs

I posted a comment to a blog about KPIs (http://mybibeat.wordpress.com/2009/11/23/how-to-define-and-select-good-kpis-for-your-business/#comment-10), and I thought it made a perfect next step in my series on how to revamp your investment, the BOTHER program:

    I am typically the consultant of KPIs, so I found it humbling to have a need for some relevant information on KPIs. I say this with the greatest of respect for your posting. While I agree on most points, I wanted to challenge one area, and hope you will take this with a pinch of salt…
    You cite an example for a professional services company: "The survival of a professional service provider depends on the number of ongoing and new projects the company handles" and go on to mention revenue. The follow-up you provide is great for measuring the $$ impact, yes, but you do not mention the satisfaction of the ongoing client. While measuring the revenue of an ongoing client is meaningful, it isn't wholly satisfying. Here's why:

    In order to grow a professional services company, one must be able to project the likelihood of growing within existing accounts where ongoing revenue is currently being realized. And while a company may continue to use a services firm because it hasn't the time to find a new one, or the energy to fire it, that doesn't necessarily mean it will give the next big project to said consultancy; it may actually be dissatisfied, even though its continued use of the firm for ongoing support gives the feeling that 'we are OK; they haven't fired us, after all.'
    In reality, the client may be unhappy and, for the reasons above, simply hasn't moved the ongoing work elsewhere, but is certainly not planning on signing over any new work.

    Instead of just measuring revenue, I would counter that for KPIs to be effective, one must rationalize and relativize by looking at qualitative and quantitative measures together, such as satisfaction points lost or gained per project against dollars (revenue), thus quantifying the average revenue per satisfaction point gained or lost. One must use statistical analysis to get this metric accurately (Partial Least Squares modeling is a wonderful tool for this), but boy, is it powerful: A) you know how satisfied a given client actually is, and B) you know how much each satisfaction point gained (through retention programs or client appreciation) or lost (through lack of attention or project dissatisfaction) means to your bottom line, and where to direct some of your business development resources.
    It takes what you stated very well to the next level of efficacy: analytics, which is where I believe the true value of BI starts to be realized. Thank you for your post; I will certainly link my blog back to yours!
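To make the metric concrete, here is a deliberately naive sketch with invented clients and numbers; the real analysis would use something like the Partial Least Squares modeling mentioned above, not a simple ratio:

```shell
# Hypothetical data. Columns: client, revenue change ($K), satisfaction points gained/lost.
cd "$(mktemp -d)"
cat > clients.csv <<'EOF'
ClientA,120,3
ClientB,-40,-2
ClientC,60,1
EOF
# Total revenue change divided by total satisfaction-point movement (absolute value).
awk -F, '{rev += $2; pts += ($3 < 0 ? -$3 : $3)}
         END {printf "$%.1fK revenue per satisfaction point\n", rev/pts}' clients.csv
```

With these invented figures, a net $140K revenue change over 6 points of satisfaction movement yields the average dollar value of a single satisfaction point, which is exactly the kind of number that tells you where to spend retention effort.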

    “BOTHER” (Before Offering To Heavily Expense, R$Invent): Unearthing BI Insights in the Most Unlikely of Places: the Existing Pools of Information Within Your Workplace

    Yes. As many of you know, I took a hiatus from my blog to really get down to the nuts and bolts of business intelligence. As a BI solutions architect working for a leading BI consultancy, I get the wonderful benefit of experiencing many different perspectives of BI in the workplace. Each year, I also get to watch how the industry grows in its implicit pervasiveness (and no, I do not mean BI pervasiveness, as in the latest "catch phrase" // one I would release BTW if you have added it to your vocabulary of late, but I digress)… No, what I mean is the textbook definition: I get to watch how business intelligence is infecting the lives of countless employees, all with the promise of making lives easier, better, faster, eliminating manual practices without eliminating their jobs, enhancing decision making with data, empowering, illuminating, targeting, and the list goes on and on. But the reality of what I get to witness is that business intelligence has become the CRM that we all wanted to avoid; the nomenclature of late; the popular trend in the workplace, where one gets to order a dashboard on the side with their reporting platform. Oh, and for just a few dollars more, you can supersize your order and get those ridiculous pie charts, with not just 3-D rendering capabilities but also every hue and shade of the color spin-wheel of life… Spin a red and lose your job; spin a green and move up the proverbial corporate hierarchy. Opening the paradigm of Pandora's BI box is not for the lighthearted… Stop reading now if you are already getting queasy.

     

    What was once a new area of interest for me, the uncovering of KPIs, or key performance indicators, and building strategically targeted performance management programs around them, has sadly manifested into the CRM nightmare I predicted over 3 years ago in this very blog. When any one thing gets overexposed (think of that infamous hotel heiress), we are systematically programmed to shift focus elsewhere, the new being more interesting than the old. And the 'WHAT' that dashboards tell (*it's Red, it's Green*) gets very old, very quickly. And before anyone measures the efficacy of their measurement system (i.e., metadata measurements of the qualitative usefulness of their BI program), practitioners leap to the quantum conclusion that BI isn't effective anymore, or wasn't ever effective, in their respective work environments. Nine times out of 10, it is the leaders who thought they wanted that extra helping of BI after attending that year's conference, flavored with some Norton or Kaplan, or the more generalized BI user conferences sponsored by the software vendors (slight shudder thinking about all of those workplace leaders who don't know what they don't know, and are invited by self-interested 'teachers' whose altruism stops as soon as the invoice is signed and dated). But again I digress.

    And really, when it is all boiled down, besides being what you may perceive to be a rant on my part, this is an impassioned plea for the next wave of BI to begin… a call to arms for the vets of our industry to stop self-promoting for one second and to start helping others build better with what they already have. To stop buying the new flavors of this month and stick with the vanilla or chocolate or strawberry. That's what is so great about keeping it simple. Not only can you slice something individual and enjoy its flavor for a lifetime of richness personally; but if and when the flavor stops providing what you need, you can always layer onto the basics and actually create something new, like a 2-tiered swirl or, for the even more daring, the 3-tiered über swirl, known to us ice cream aficionados as Neapolitan.

    Now, I realize one must crawl before walking, and walk before running, or at least so I am told whenever I step onto my soapbox and herald change in the way BI is being implemented today. Whether I champion change on the street or, in this case, in the cubicle aisles of the workplace, I keep beating my drum: yes, Virginia, there is a Santa Claus, and he likes it when BI delivers what it promises AND CAN deliver. It is up to practitioners to hang up their green-colored glasses and start thinking about all of those reasons they got into BI in the 1st place; it is a powerful step back to take, and one I can personally speak to, having just returned from my sojourn with a greater understanding that knowing the 'WHAT' about a business is the surface-level cut that indicators or operational reporting will yield. Looking further at the usefulness of one's metrics, and asking those painful questions like "what are you going to do with that data" instead of just becoming a reporting jockey, driving the 'WHAT' down to the 'WHY', is only half the battle; it is the 'WHAT CAN BE DONE ABOUT IT' that takes it to a whole other level. And only those analysts truly inundated with the data from all areas of the company, not just finance or operations, but market research, retail, development, etc., can truly answer the timeless question of 'So, what do we do now?', because they have the data to steer the powers that be in that direction.

    This isn't a tool that I am prescribing; it might be spreadsheets and hours of analyst bandwidth that finally get you where you need to be to make your BI programs and platforms useful. And the only way to get there is to take a step back and examine your business frankly, ask the right people the right questions, and finally, question (with respect) the answers you get, or keep asking the FIVE WHYs until you get to the root cause of an existing platform's efficacy. Otherwise, if you don't change your approach, you will always get what you have always got, and trust me, there are only a handful (< 5%) of companies doing this today. Start now by pulling off the band-aid, or kicking the crutch of expenditure away, and use what you've got to explore what you have in your data stores, manual as it may be, to find the nuggets of gold you want to be successful. No, let me rephrase: the nuggets you NEED to be successful. Check back over the next few days for actual steps to achieve success. I will prescribe a 5-step DON'T BOTHER plan of attack for reinventing your BI program before you reinvest, starting with where to begin getting down and dirty with your existing analytics. This is not for the faint-hearted; you'll find many of today's business intelligence practitioners tend to avoid, not know about, or be too intimidated to uncover what I will reveal. As we move into the new year, why not shift the paradigm of your existing BI mindset by taking a bite from the beefiest side of all: the analytic?!

    More to come tomorrow…


    Business intelligence in hospitality: Adding value to daily decisions

    Insight you can act on equals business success




    After more than 10 years, business intelligence (BI) is catching on. In many organizations, everyone from C-level executives to the controller to the chef relies on dashboards, scorecards, and daily reports to provide information about their business and the entire enterprise.

    A recent IDC Research study ("Worldwide Business Intelligence Tools 2005 Vendor Shares," October 2006, #202603) found that organizations are looking for more than just tools for queries and reports. People want insight from their BI solution to support collaborative analysis, forecasting, and decision-making, so that BI can help drive better business processes—and results. Microsoft BI solutions can provide such support—and have helped companies such as Hilton and Expedia save money, provide superior guest service, and improve business performance and the bottom line. In this article, we’ll discuss how the Microsoft Business Intelligence platform can help your company.


    Insight you can act on

    "The trend now is to move from reporting about the past to studying targeted information about how key metrics or key performance indicators (KPIs) compare to current goals," says Sandra Andrews, industry solutions director in the Retail & Hospitality group at Microsoft. "Delivering the right information to the right people in the right format at the right time is critical. Empowering employees with real-time views of where the business is now and where it’s headed adds value to daily decisions."

    To manage and forecast accurately, you need an integrated system that provides one version of the truth, and that information must be easily accessible to your teams. But many organizations in the hotel industry are still using different BI tools in different departments. Complicating matters further, companies use separate systems for different locations. As a result, it can be extremely difficult to standardize information and reports or to forecast staffing and supply needs, let alone provide real-time analytics. However, the business benefits of delivering information to people in a format they can use to take action or make better business decisions far outweigh the costs.



    Keeping scorecards to track BI

    Scorecarding is an efficient, immediate way to capture the key data you need. Recently, Expedia implemented a scorecard solution to better serve online customers and put complex Web performance metrics and KPIs at its analysts’ fingertips. The result? Automated data collection saved time and effort, allowing analysts to spend their time developing answers rather than crunching numbers.

    "Customer satisfaction is essential to helping make Expedia a great company. With scorecarding, we have the means to evaluate how well we are doing to make the company even greater," says Laura Gibbons, manager of Customer Satisfaction & Six Sigma at Expedia. "And if scorecarding is adopted throughout the company, I believe we are that much closer to becoming the largest and most profitable seller of travel in the world."



    A system everyone can use

    Making sense of enormous quantities of rapidly changing data, visualizing and prioritizing that information, and holding the organization accountable for specific performance metrics are all essential for success. If you have insight you can act on, you can align those activities with corporate goals and forecasts. And by empowering people through familiar tools, you make it easier for your employees to access the information they need to build relationships with guests.

    The Microsoft Business Intelligence platform leverages the Microsoft Office system on the front end, helping you create a BI solution that your people can use easily, without a steep learning curve. "Managers and executives can create reports in Excel, link them to PowerPoint, and easily update their reports and presentations. Hotel managers are already using Excel," Andrews says. "No matter what BI tool organizations adopt, ultimately the user extracts the data into an Excel file to manipulate it. By giving your people the information they need in the Office system right from the start, you reach all employees and increase collaboration. You change the way your company works."



    It’s all about forecasting

    To provide the type of service that generates customer loyalty, you need to be able to pull data from multiple systems to analyze guest profiles, forecast trends, determine occupancy rates, or predict food and beverage sales. The right BI solution can help you manage your business, increase productivity, and provide the excellent service that builds customer loyalty.

    For example, Hilton Hotels wanted an adaptable, scalable solution that would include demand-based pricing and improve forecasting for group, catering, and public-space sales. Hilton leveraged Microsoft’s Business Intelligence platform, deploying Microsoft SQL Server 2005 and using SQL Server Analysis and Reporting Services all running on the Microsoft Windows Server 2003 operating system. As a result of the Microsoft BI solution, Hilton increased their data processing rate by 300 percent. They reduced catering forecast time by 25 percent. And they improved customer service by accommodating more catering requests, all with a 15-percent reduction in deployment time. Kathleen Sullivan, vice president, Sales and Revenue Management Systems at Hilton Hotels, says, "SQL Server 2005 provides Hilton with the power and extensibility to deliver revenue analysis and forecasting capabilities."



    Business intelligence and beyond

    One trend that’s already changing revenue and channel management is how BI fuels a better understanding of convention-space and catering needs, generating new revenue for sales and catering.

    Organizations are integrating customer relationship management (CRM) sales tools with business intelligence to help book their conventions and catering events; the sales department can determine which event will bring in the most all-property revenue. Harrah’s Entertainment, a forerunner in innovative use of BI, uses customer intelligence and CRM strategies to track and increase customer loyalty. Harrah’s hands out credits to guests each time they visit the casino and play games, then tracks those visits; the more often a guest visits, the greater the value of the reward. With that history, Harrah’s can predict the value of each guest, their habits, and how to increase each guest’s total revenue per available room (REVPAR).



    The Microsoft Business Intelligence solution

    IDC’s competitive analysis report, "Worldwide Business Intelligence Tools 2005 Vendor Shares," found that Microsoft’s BI tools revenue growth was more than twice that of the other leading database management systems (DBMS) and legacy pure-play BI vendors.

    The Microsoft Business Intelligence platform is a complete and integrated solution. Whether you use it as your data warehouse platform, your day-to-day user interface, or your analysis and reporting solution, Microsoft provides the fastest growing business intelligence platform to support your needs. The Microsoft BI solution includes the following servers and client tools to enhance your business:

    Microsoft SQL Server 2005 (along with Visual Studio 2005 and BizTalk Server 2006) provides advanced data integration, data warehousing, data analysis, and enterprise reporting capabilities to help ensure interoperability in heterogeneous environments and speed the deployment of your BI projects.

    Microsoft SQL Server Reporting Services is a comprehensive, server-based reporting solution designed to help you author, manage, and deliver both paper-based and interactive Web-based reports.

    Microsoft SQL Server Integration Services (SSIS) is a next generation data integration platform that can integrate data from any source. SSIS provides a scalable and extensible platform that empowers development teams to build, manage, and deploy integration solutions to meet unique integration needs.

    Microsoft SQL Server Analysis Services (SSAS) provides tools for data mining with which you can identify rules and patterns in your data, so that you can determine why things happen and predict what will happen in the future – giving you powerful insight that will help your company make better business decisions. SQL Server 2005 Analysis Services provides, for the first time, a unified and integrated view of all your business data as the foundation for all of your traditional reporting, online analytical processing (OLAP) analysis, KPI scorecards, and data mining.

    End-user tools build on the BI platform capabilities of Microsoft SQL Server.

    Microsoft Office Excel 2007 helps you securely access, analyze, and share information from data warehouses and enterprise applications, while maintaining a permanent connection between the Excel spreadsheet and its data source.

    Microsoft Office SharePoint Server 2007 becomes a comprehensive portal for all of the BI content and end-user capabilities in SQL Server Reporting Services and the Microsoft Office 2007 release, providing secure access to business information in one place. Excel Services allow customers to more effectively share and manage spreadsheets on the server.

    Microsoft Office PerformancePoint Server 2007 offers an easy-to-use performance management application spanning business scorecarding, analytics, and forecasting to enable companies to better manage their business.

    Business intelligence: Adding value to daily decisions

    Anonymous Web surfing at your fingertips…IP Proxy Servers


    Not saying you should do this, but those pesky IT administrators, with their firewalls and IP blocks disallowing work-time visits to social networking sites, are really preventing a highly valuable form of word-of-mouth marketing for their companies through said social networks, in my opinion. All things in moderation, and I do realize there are “abusers” who would spend all day on Facebook or checking their Twitter timelines. But for the mainstreamers, like myself, who “get it”, social networking provides companies with great recruiting vehicles, especially when the word of mouth comes from inside the company’s walls, from other employees. It is also a great lead generator for consultancies like my own, so blocking access seems counterintuitive and archaic in my humble opinion. Here is a list of IP addresses to use as proxy servers within your Internet Options (go to Tools, Internet Options, Connections, LAN settings, and enter one in the Proxy Server/Port fields). I recommend you use one listed in the US.
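    The same setting can be applied in code rather than through the Internet Options dialog. A minimal Python sketch using only the standard library (the proxy address below is just the US-based “high anonymity” entry from the list and may well be dead by the time you read this, so test it first):

```python
import urllib.request

# Same idea as Internet Options > Connections > Proxy Server, but scripted.
# The IP/port is illustrative only -- verify the proxy is alive before relying on it.
proxy = urllib.request.ProxyHandler({"http": "http://64.29.148.15:80"})
opener = urllib.request.build_opener(proxy)

# opener.open("http://example.com/") would now be routed through the proxy.
```

    Swap in any entry from the tables below; "transparent" proxies forward your real IP, so prefer the "anonymous" or "high anonymity" types if anonymity is the point.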

     

    All responsibility around usage of said proxies falls on YOU, dear reader. Keep in mind that this list was gained from a publicly available proxy listing service and should be used with caution and NOT for illicit or scandalous purposes; any harm done is YOUR responsibility, owned by YOU and NOT me.


     

    Care of Proxy4Free.com



    Proxy List 1

     

    IP                 Port   Type             Country               Last Test
    148.233.159.58     8080   anonymous        Mexico                2009-07-13
    84.255.246.20      80     anonymous        Slovenia              2009-07-13
    67.69.254.250      80     anonymous        Canada                2009-07-13
    67.69.254.254      80     anonymous        Canada                2009-07-13
    125.245.160.130    8080   anonymous        South Korea           2009-07-13
    67.69.254.248      80     anonymous        Canada                2009-07-13
    218.14.227.197     3128   anonymous        China                 2009-07-13
    218.6.16.162       80     anonymous        China                 2009-07-13
    93.123.104.66      8080   anonymous                              2009-07-13
    41.210.252.11      8080   anonymous        Angola                2009-07-13
    78.41.19.30        3128   anonymous        Czech Republic        2009-07-13
    218.75.100.114     8080   anonymous        China                 2009-07-13
    193.37.152.154     3128   anonymous        Germany               2009-07-13
    86.101.185.109     8080   anonymous        Hungary               2009-07-13
    78.108.96.47       8080   anonymous        Czech Republic        2009-07-13
    86.101.185.97      8080   anonymous        Hungary               2009-07-13
    200.174.85.195     3128   transparent      Brazil                2009-07-13
    61.172.244.108     80     anonymous        China                 2009-07-13
    202.98.23.114      80     anonymous        China                 2009-07-13
    64.29.148.15       80     high anonymity   United States         2009-07-13
    121.58.96.10       3128   anonymous        China                 2009-07-13
    200.65.129.1       80     anonymous        Mexico                2009-07-13
    64.12.223.232      80     anonymous        United States         2009-07-13
    189.108.102.138    3128   anonymous        Brazil                2009-07-13
    121.204.0.2        80     anonymous        China                 2009-07-13
    67.69.254.246      80     anonymous        Canada                2009-07-13
    203.160.1.75       80     anonymous        Vietnam               2009-07-13
    203.160.001.112    80     anonymous        Vietnam               2009-07-13
    64.12.222.232      80     anonymous        United States         2009-07-13
    119.70.40.102      8080   anonymous        South Korea           2009-07-13
    143.215.129.230    3128   anonymous        United States         2009-07-13
    59.39.145.178      3128   anonymous        China                 2009-07-13
    203.162.183.222    80     transparent      Vietnam               2009-07-13
    67.227.132.249     80     high anonymity   United States         2009-07-13
    121.9.221.188      80     high anonymity   China                 2009-07-13
    67.69.254.245      80     anonymous        Canada                2009-07-13
    85.214.81.233      3128   anonymous        Germany               2009-07-13
    200.174.85.193     3128   transparent      Brazil                2009-07-13
    217.169.182.206    8080   anonymous        Czech Republic        2009-07-13
    222.68.207.11      80     anonymous        China                 2009-07-13
    60.29.241.102      80     anonymous        China                 2009-07-13
    222.218.156.66     80     anonymous        China                 2009-07-13
    203.160.1.66       80     anonymous        Vietnam               2009-07-13
    121.12.249.207     3128   anonymous        China                 2009-07-13
    222.68.206.11      80     anonymous        China                 2009-07-13

    Proxy List 2

    IP                 Port   Type             Country               Last Test
    219.137.229.218    3128   anonymous        China                 2009-07-13
    64.29.148.30       80     high anonymity   United States         2009-07-13
    221.130.191.216    8080   anonymous        China                 2009-07-13
    84.1.150.30        8080   anonymous        Hungary               2009-07-13
    61.172.249.96      80     anonymous        China                 2009-07-13
    203.160.1.85       80     anonymous        Vietnam               2009-07-13
    78.154.132.241     8080   anonymous        Latvia                2009-07-13
    222.124.190.12     8080   anonymous        Indonesia             2009-07-13
    114.30.47.10       80     anonymous        Australia             2009-07-13
    118.175.255.10     80     anonymous        Thailand              2009-07-13
    61.152.246.226     80     high anonymity   China                 2009-07-13
    67.69.254.253      80     anonymous        Canada                2009-07-13
    80.148.27.97       8080   anonymous        Germany               2009-07-13
    67.69.254.240      80     anonymous        Canada                2009-07-13
    203.160.1.94       80     anonymous        Vietnam               2009-07-13
    141.85.118.1       80     high anonymity   Romania               2009-07-13
    200.65.127.161     3128   anonymous        Mexico                2009-07-13
    213.180.131.135    80     anonymous        Poland                2009-07-13
    219.255.135.180    80     anonymous        South Korea           2009-07-13
    193.37.152.206     3128   anonymous        Germany               2009-07-13
    119.167.225.136    8080   anonymous        China                 2009-07-13
    189.56.61.33       3128   anonymous        Brazil                2009-07-13
    201.147.20.245     80     anonymous        Mexico                2009-07-13
    119.40.99.2        8080   anonymous        Mongolia              2009-07-13
    80.90.82.93        80     anonymous        Albania               2009-07-13
    61.172.249.94      80     anonymous        China                 2009-07-13
    83.230.181.116     3128   anonymous        Spain                 2009-07-13
    67.69.254.252      80     anonymous        Canada                2009-07-13
    86.101.185.98      8080   anonymous        Hungary               2009-07-13
    200.65.129.2       80     anonymous        Mexico                2009-07-13
    121.9.221.187      80     high anonymity   China                 2009-07-13
    195.229.150.7      80     anonymous        United Arab Emirates  2009-07-13
    60.12.226.18       80     anonymous        China                 2009-07-13
    67.91.182.64       3128   anonymous        United States         2009-07-13
    125.70.229.30      8080   anonymous        China                 2009-07-13
    67.69.254.247      80     anonymous        Canada                2009-07-13
    208.78.125.18      80     anonymous        United States         2009-07-13
    148.233.229.235    3128   anonymous        Mexico                2009-07-13
    64.37.184.17       80     high anonymity   United States         2009-07-13
    203.160.001.103    80     anonymous        Vietnam               2009-07-13
    86.101.185.112     8080   anonymous        Hungary               2009-07-13
    202.98.23.116      80     anonymous        China                 2009-07-13
    61.172.246.180     80     anonymous        China                 2009-07-13
    218.28.176.246     3128   anonymous        China                 2009-07-13
    67.69.254.243      80     anonymous        Canada                2009-07-13

    Optimizing BI Operational Reports with Internal IT Ticketing / CRM Systems

     

    How often do you think about optimizing your operational reporting processes with your internal ticketing system / IT CRM? Probably not as often as you should.

    Being a Lean Six Sigma Black Belt, I can’t help but think about these things.

    Promoting a report from a test environment to a production environment often involves customer communication in the form of an email. Why not standardize and automate that process?

    First, you should have an inventory of your reports with an ID or CUID. This can be extracted from the BusinessObjects auditor universe or your BI provider’s audit logs, respectively; in the worst case, start reporting off your SQL data source instances or event-logging tables.
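    Building that inventory can be a single query once you know where your audit data lives. A minimal sketch (the table and column names below are hypothetical stand-ins for your own auditor/event-logging schema, and SQLite stands in for the real audit database connection):

```python
import sqlite3  # stand-in for your BI audit database connection

# Hypothetical schema: one row per published report object.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE report_audit (cuid TEXT, report_name TEXT, folder_path TEXT)")
conn.executemany(
    "INSERT INTO report_audit VALUES (?, ?, ?)",
    [("AS0x1", "Daily Sales Trend", r"\Finance\Sales"),
     ("AS0x2", "Occupancy Forecast", r"\Operations\Rooms")],
)

# The inventory: every report keyed by its ID/CUID.
inventory = {cuid: (name, folder) for cuid, name, folder
             in conn.execute("SELECT cuid, report_name, folder_path FROM report_audit")}
# inventory now maps CUID -> (report name, folder path)
```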

    Here is an example of C# code that automates the promotion process and generates a standardized, user-friendly output: it calls out the folder class where the report lives and provides a login link, or, if you are using SSO, a pass-through token that logs the user in auto-magically. Anything in maroon is a comment for you, dear blog reader; anything in navy is part of the actual email content.

    “Your report has been created and placed on the TEST system for testing.

    Please login (by clicking the link below) and test the report.
    http://<servername>:<port>/InfoViewApp/logon.jsp

    Your report can be found in: \\<LOB folder>\<Department folder>\<Department subfolder>

    The report name is: <ReportName>

    Hide the CUID; display only the mapped report name to end users, as you see above.

    Link using OpenDocument by appending to the C# code: http://<servername>:<port>/OpenDocument/opendoc/<platformSpecific><parameter1>&<parameter2>&…&<parameterN>

    Or, you can go directly to the updated or new report here:
    You will have to login manually if you use this link. Remember to change the default Authentication (on the login window) to "Windows AD" or your respective authentication method (“Enterprise” or “LDAP” are your other choices).
    Once the report has been tested, please let me know by re-opening the ticket so I can move it to the production system.

    Untested reports are purged every 30 days. Should you want to make any further changes to an existing report outside of data quality corrections, please open a new ticket but reference it to the old one for tracking purposes. Thank you for your kind consideration and adherence to the BI team report process.”

    Note:
    To obtain the document ID, navigate to the document within the Central Management Console (CMC).
    The properties page for the document contains the document ID and the CUID. Use this value for the iDocID parameter.
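    Generating that email is just string templating over the inventory and ticket metadata. Here is the same idea sketched in Python rather than C#; the server name, port, folder, and CUID are placeholders, and the openDocument.jsp path with iDocID/sIDType parameters follows the BusinessObjects OpenDocument pattern for Java deployments, so verify it against your own platform:

```python
from string import Template

# Standardized email body; every $variable is a placeholder filled per ticket.
EMAIL = Template("""Your report has been created and placed on the TEST system for testing.

Please login (by clicking the link below) and test the report:
http://$server:$port/InfoViewApp/logon.jsp

Your report can be found in: $folder
The report name is: $report_name

Or open it directly via OpenDocument pass-through:
http://$server:$port/OpenDocument/opendoc/openDocument.jsp?iDocID=$cuid&sIDType=CUID
""")

body = EMAIL.substitute(
    server="biserver", port="8080",          # hypothetical deployment details
    folder=r"\Finance\Sales",                # folder class from the inventory
    report_name="Daily Sales Trend",         # mapped name shown to the user
    cuid="AS0x1",                            # CUID kept in the link, hidden from prose
)
```

    The body string can then be handed to whatever your ticketing system uses to send mail, so every promotion email looks identical.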

    Libra – a week of house cleaning and independence is in store for you!

    Your Horoscope – This Week (April 26 – May 2, 2009)

    Don’t start or decide on anything of matter on Monday or Tuesday – Moon and Mars are in your ruling house where the Hermit is deriving a joyous solitude that may surprise you dear Libra, if you choose to listen. Your key word is independent, so break the chains of codependency now. Your love life looks hotter than ever. You can’t escape the demands and desires of your lover. The presence of Mars in your relationship zone indicates it’s time to clear the air. If there are any issues that have been pushed under the carpet, they’re about to be exposed. You may find your partner a lot more argumentative than usual. Talking over difficulties will help the energy in the relationship to move instead of stagnate. There may be some turbulence, but you’ll also feel a lot better for having shared your feelings.

     

    Your Horoscope – April 2009

    You’ll be torn between work and personal matters at home and need to find balance on April 1 and 2. Don’t let others push you into making decisions quickly. You’ll have the chance to establish warm and affectionate bonds and enjoy life on the weekend of April 4. Catch up on a work project on the evening of April 5. Take your obligations very seriously on April 6 and 7. The Moon in your sign on April 8 and 9 will relax you and help you take it easy for a couple of days. You’ll have to be proactive and assertive in some situations, though, and not wait for developments. Relationships may be strained on the weekend of April 11 if you don’t deal with issues immediately. April 13 and 14 would be a good time to take a day trip if you feel the need for a change of scene. Burn off some excess energy by hitting the gym. Work will be intense on April 15 and 16 and you may need to put in some extra hours to get things finished. You may want to get involved in a humanitarian cause or at least be of help to someone in need on the weekend of April 18. A burst of energy on April 22 and 23 may be short lived but will motivate you to start new projects. Do some reading or call a friend on the evening of April 26. You’ll feel frustrated by a critical person in authority on April 27 and 28.

    Attn: Northwest BI Professionals – Register now for the next TDWI NW Chapter Meeting

     

    Date: May 14, 2009

    Time: 5:30–8:00PM, with billiards and networking to be held at the Parlor immediately following the event

    Location: Lincoln Square, 700 Bellevue Way NE, Bellevue, WA 98004 (see map below)


    Lincoln Square

    Speaker: Dave Wells, TDWI Research Director and Avid Conference Speaker

    Customer Speaker: Vincent Ippolito, Washington Dental Services’ Director of BI

    Topic: How to Deploy BI Programs in Time of Economic Hardship

    Registration is free. Food is free to attendees. And best of all (unlike those other data organizations), you DON’T have to be a member to attend, nor pay to attend, even if you ARE NOT currently a member.

    Space is extremely limited and advance registration is recommended.

    Link to Register: http://1105media.inquisiteasp.com/cgi-bin/qwebcorporate.dll?P5RVKQ

    TDWI NW Page: http://www.twdi.org/northwest

    Re-visiting Organizational Objectives and Values

    Simplistically speaking, the BSCOL (Balanced Scorecard Collaborative) defines a cascaded-model approach for linking corporate values with individuals’ performance-review goals/objectives. Start at the bottom with each individual’s personal goals, and flow up from there to the departmental goals, the division goals, and finally the executive-tier strategy/vision/goals/objectives; this will help you see where your values have gaps relative to what your employees are driving the company toward, and where you have alignment.

     

    Restructuring those values, either at the top (harder) or at the individual-contributor level at the bottom (easier), to ensure alignment will drive better performance from your people, because of the visibility it offers them: it demonstrates how what they are tasked to complete in a year contributes to helping the company achieve its organizational values. If you start with existing values, then add the existing objectives as a starting point and see if you can map the two together. Nine times out of ten, they will NOT be aligned, and that is a big AH-HA for many leaders to see on paper.

     

    Then cascade from there to the division-leader tier, the department-management tier, and lastly the individual contributors. That is the vertical alignment process from top to bottom, if that is your preference. Once you have these vertical lines mapped, look for overlapping or conflicting values between divisions, departments, and people, and find affinity areas that can be mapped logically back to the values where you started (in a top-down approach). BSCOL.ORG and my blog (shameless plug) are both great resources offering excellent templates to assist in this process, like strategy maps (see the graphic provided by the TDWI BI Journal below), with a twist: instead of using the four perspectives, or in conjunction with them (as that is very valuable in and of itself), use your organizational hierarchy instead. Financial becomes the CEO’s established organizational values/objectives; Customer becomes the divisions that report to the CEO, where the circles become each division’s values/goals/objectives; Internal becomes the departments; and Learning and Growth becomes the individual contributors’ objectives that their manager lists out in those pesky annual performance reviews.
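    Once the cascade is written down, the gap check itself is mechanical. A toy sketch (every goal name below is hypothetical): record which higher-level objective each goal claims to support, and anything below the top with no parent is a mis-alignment to surface.

```python
# Hypothetical cascade: each objective points at the objective it supports (None = no parent).
cascade = {
    "Grow RevPAR 5%": None,                                    # CEO-level value (top)
    "Increase group bookings": "Grow RevPAR 5%",               # division tier
    "Cut catering forecast time": "Increase group bookings",   # department tier
    "Automate forecast report": "Cut catering forecast time",  # individual contributor
    "Learn French": None,                                      # orphan goal: a gap
}

top = "Grow RevPAR 5%"

# Anything (other than the top) without a parent is unaligned with the strategy.
gaps = [goal for goal, parent in cascade.items() if parent is None and goal != top]
print(gaps)  # -> ['Learn French']
```

    Running the same check across divisions also exposes the overlapping or conflicting values mentioned above, since two branches claiming the same parent show up side by side.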

     

    (Sorry to you big believers, but until true performance management like what I have outlined is institutional in all companies, the PR system is a bell-curved sham where some of the best employees get the short end of the bell-curve stick; how could one department have all the highest performers, even if they are a crackerjack team? One day… a girl can dream, right?)

     

    -Laura Edell Gibbons

    New LinkedIn Group: TDWI NW Chapter: Get Your Social Intelligence On!

    Having fueled a social networking surge of its own, TDWI has started embracing what is super exciting in my eyes: social networking and BI. TWITTER: http://www.twitter.com/lauragibbons (had to shamelessly self-promote)

    TDWI NW CHAPTER: http://www.twitter.com/TDWI_NW_CHAPTER

    TDWI on TWITTER: http://www.twitter.com/TDWI

    LINKEDIN: http://www.linkedin.com/in/lauragibbons

    TDWI NW CHAPTER on LINKEDIN: http://www.linkedin.com/groups?about=&gid=820537&trk=anet_ug_grppro

    Enjoy! Laura Edell Gibbons, TDWI NW Chapter Board Officer & Chapter Secretary