Eye Tracking & Applied ML: Soapbox Validations

Anyone who has read my blog (shameless self-plug: http://www.lauraedell.com) over the past 12 years will know I am very passionate about drinking my own analytical Kool-Aid. Whether during my stints as a Programmer, BI Developer, BI Manager, Practice Lead / Consultant or Senior Data Scientist, I believe wholeheartedly in measuring my own success with advanced analytics. Even my fantasy football success (more on that in a later post) can be attributed to advanced machine learning… But you wouldn't believe how often this type of measurement gets ignored.
[Image: eye tracking hardware]
Introducing you, dear reader, to my friend "Eye-Tracker" (ET). Daunting little set of machines in that image, right?! But ET is a bona fide bada$$ in the world of measurement systems; oh yeah, and ET isn't a new tech trend – in fact, mainstream ET systems are a staple of any PR, marketing or web designer's tool arsenal, a yardstick for measuring program efficacy by comparing users' intended behavior with their actual outcomes/actions.

In my early 20s, I had my own ET experience and have been a passionate advocate since, having witnessed what happens when you compound user inexperience with poorly designed search / e-commerce sites. I was lucky enough to work for the now-uber online travel company who shall go nameless (okay, here is a hint: remember a little ditty that ended with some hillbilly singing "dot commmm" and you will know to whom I refer). This company believed so wholeheartedly in the user experience that they allowed me, young ingénue of the workplace, to spend thousands on eye tracking studies against a series of balanced scorecards I was developing for the senior leadership team. This matters because you can ASK someone whether a designed visualization is WHAT THEY WERE THINKING or WANTING, even one built iteratively with the requestor, and still not get a straight answer. Why, you ponder, would any of this be necessary when I could just ask/survey my customers about their online experiences with my company and save beaucoup $$?

Well, here's why: 9 times out of 10, survey participants, not wanting to offend, will nod 'yes' instead of being honest; conflict avoidance at its best. Note, this applies to most, but I can think of a few in my new role who are probably reading this and shaking their heads in disagreement at this very moment.

Eye tracking studies are used to measure efficacy by tracking which content areas engage users' brains vs. areas that fall flat, are lackluster, overdesigned and/or contribute to eye/brain fatigue. They measure this by "tracking" where and for how long your eyes dwell on a quadrant (aka a visual / website content / widget on a dashboard) and by recording the path and movement of the eyes between different quadrants on a page. It's amazing to watch these advanced, algorithmically tuned systems pick up even the smallest flick of one's eyes, whether darting to or away from the above-the-fold content, in near real time. The intended audience being measured generates the validation statistics necessary to evaluate how well your model fits the data. In the real world, attaboys and "ya done a good job" high fives should be doled out only after validating efficacy: e.g., if customers' dwell time increases, you can determine whether that lift is random or the result of intentional design; otherwise, go back to the proverbial drawing board until you earn that attaboy outright.
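To make those mechanics concrete, here is a minimal sketch in Python of how dwell time per quadrant (area of interest) could be tallied from raw gaze samples. It is purely illustrative: the sample data, the 1280×720 page size and the quadrant names are my own assumptions, not the output format of any particular eye-tracking system.

```python
# Toy dwell-time tally: gaze samples are assumed to be (timestamp_s, x, y) tuples;
# real eye trackers emit far richer data (pupil size, fixations vs. saccades, etc.).
AOIS = {
    "top_left":     (0, 0, 640, 360),       # (x_min, y_min, x_max, y_max)
    "top_right":    (640, 0, 1280, 360),
    "bottom_left":  (0, 360, 640, 720),
    "bottom_right": (640, 360, 1280, 720),
}

def dwell_times(samples):
    """Sum the time between consecutive samples into whichever quadrant the gaze falls in."""
    totals = {name: 0.0 for name in AOIS}
    for (t0, x, y), (t1, _, _) in zip(samples, samples[1:]):
        for name, (x0, y0, x1, y1) in AOIS.items():
            if x0 <= x < x1 and y0 <= y < y1:
                totals[name] += t1 - t0
                break
    return totals

# Hypothetical 60 Hz gaze stream drifting from the top left toward the bottom right.
gaze = [(i / 60.0, 100 + i * 8, 80 + i * 5) for i in range(120)]
print(dwell_times(gaze))
```

Comparing those totals for a test audience against the totals your design was supposed to produce is exactly the "random vs. intended" check described above.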

What I also learned, which seems a no-brainer now: people read from the left/upper corner to the right/bottom of a page (LURB). So, when I see anything that doesn't at LEAST follow those two simple principles, I just shake my head and tsk tsk tsk, wondering whether human evolution is shifting with our digital transformation journey or whether we are destined to be bucketed with the "that's interesting to view once" crowd instead of rising to the level of usefulness the visualization was designed for.

Come on now, how hard is it to remember to stick the most important info in that top-left quadrant and the least important in the bottom right, especially when creating visualizations for senior execs in the corporate workplace? They have even less time and attention these days to focus on even the most relevant KPIs, the ones they need to monitor to run their business and on which they'll be asked to update the CEO each quarter, what with all the fun distractions of the latest vernacular du jour taking up their brain space: "give me MACHINE LEARNING or give me death," the upstart that replaced mobile/cloud/big data/business intelligence (you fill in the blank).
But for so long, it was me against the hard reality that no one knew what I was blabbing on about, nor would they give me carte blanche to re-run those studies ever again. And lo and behold, my Laura-ism soapbox has now been vetted, in fact quantified, by a prestigious university professor from Carnegie. All of this is possible because a little-known hero named Edmund Huey, now near and dear to my heart and grandfather of the heatmap, followed up his color-friendly block chart by building the first device capable of tracking eye movements while people were reading. This breakthrough initiated a revolution for scientists, but it was intrusive: readers had to wear special lenses with a tiny opening and a pointer attached, like the first image pictured above.
Fast forward 100 years… combine all the ingredients in the cauldron of innovation and technological advancement, sprinkle in my favorite algorithmic pals, CNNs and LSTMs, and voilà! You have just baked yourself a popular visualization known as a heat/tree map (with identifiable info redacted):
This common visual is akin to eye tracking analytics, which you will see exemplified in the last example below. Cool history lesson, right?
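For the curious, here is a shape-level sketch, in Python/Keras, of the kind of CNN-plus-LSTM pairing I am alluding to: a small convolutional encoder reads a sequence of page frames and an LSTM carries the temporal context, with the output reshaped into a coarse attention heatmap. Every layer size below is an assumption made for illustration; it is not the model behind the redacted visual above.

```python
from tensorflow.keras import layers, Model

# Assumed input: sequences of 10 frames, each a 64x64 RGB capture of the page.
frames = layers.Input(shape=(10, 64, 64, 3))
x = layers.TimeDistributed(layers.Conv2D(16, 3, padding="same", activation="relu"))(frames)
x = layers.TimeDistributed(layers.MaxPooling2D(4))(x)     # -> (10, 16, 16, 16) feature maps
x = layers.TimeDistributed(layers.Flatten())(x)           # -> (10, 4096)
x = layers.LSTM(128)(x)                                   # temporal context across the sequence
heat = layers.Dense(16 * 16, activation="sigmoid")(x)     # coarse 16x16 attention map
heat = layers.Reshape((16, 16))(heat)

model = Model(frames, heat)
model.compile(optimizer="adam", loss="binary_crossentropy")
model.summary()
```

The point is simply that the CNN learns what page content looks like while the LSTM learns how attention moves over time, which together can approximate the heatmaps an eye tracker would measure directly.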

Even cooler is this example from travel website Travel Tripper, which published Google eye-tracking results specific to the hotel industry. Instead of the treemap you might be used to (akin to an out-of-the-box Tableau or other BI tool visualization), you get the same coordinates laid out over search results. Imagine having your website underneath: instead of guessing what content should sit above or below the fold, or in the top left or right of the page, you can use these tried-and-true eye tracking methods to quantify exactly which content items customers or users are attracted to first and where their eyes "dwell" the longest on the page (red hot).
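If you want to see how such an overlay comes together, here is a small illustrative Python sketch (my own toy example, not Travel Tripper's or Google's method) that accumulates duration-weighted fixation points into a smooth heatmap you could alpha-blend over a screenshot of the page:

```python
import numpy as np
from scipy.ndimage import gaussian_filter
import matplotlib.pyplot as plt

HEIGHT, WIDTH = 720, 1280   # assumed page size in pixels

def gaze_heatmap(fixations, sigma=40):
    """Accumulate (x, y, duration_s) fixations and blur them into a smooth heatmap."""
    heat = np.zeros((HEIGHT, WIDTH))
    for x, y, duration in fixations:
        if 0 <= int(y) < HEIGHT and 0 <= int(x) < WIDTH:
            heat[int(y), int(x)] += duration           # weight each point by dwell duration
    heat = gaussian_filter(heat, sigma=sigma)           # spread each point into a soft blob
    return heat / heat.max() if heat.max() > 0 else heat

# Hypothetical fixations clustered in the top-left, above-the-fold region.
fixations = [(200, 150, 1.2), (260, 180, 0.9), (900, 500, 0.3)]
heat = gaze_heatmap(fixations)

plt.imshow(heat, cmap="hot", alpha=0.7)                  # alpha-blend this over a page screenshot
plt.axis("off")
plt.show()
```

The red-hot regions are simply where fixations cluster and linger; laying that over your own page makes the above-the-fold, top-left argument self-evident.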

So, for those non-believers, I say: become a web analytics trendsetter, driving the future of machine design forward (à la "Web Analytics 3.0").

Be a future-thinker, forward mover, innovator of your data science sphere of influence, always curious yet informed to make intelligent choices.


Futures According to Laura… Convergence of Cloud and Neural Networking with Mobility and Big Data

It’s been longer and longer between my posts and as always, life can be inferred as the reason for my delay.

But I was also struggling with feeling a sense of “what now” as it relates to Business Intelligence.

Many years ago, when I first started blogging, I would write about where I thought BI needed to move in order to remain relevant in the future. And those futures have come to fruition lately, running the gamut from merging social networking datasets into traditional BI frameworks to the now more common use case of applying composite visualizations to data (microcharts, as an example). Perhaps more esoteric was my staunch stance on the Mobile BI marriage, a future many disputed with me back when the first iPhone was released. In fact, most did not own the first release of the iPhone, and many were still RIM subscribers. It was hard for the BlackBerry crowd to fathom a world unbounded by keyboards and scroll wheels, and how that would be a game changer for mobile BI.

And of course, once the iPad was introduced, it was a game-over moment. Execs everywhere wanted their iPads to have the latest and greatest dashboards/KPIs/apps. From Angry Birds to their daily sales trend, CEOs and the like had new brain candy to distract them during those drawn-out meetings. Instead of wanting that PDF or PowerPoint update, they wanted to receive the same data on their iPads. Once they did, they realized that understanding "WHAT" is happening was only the crack that got them hooked for a while. Unfortunately, the efficacy of KPI colors and related numbers only satisfies a one-person show, and as we know, it isn't the CEO who analyzes why a RED KPI indicator shows up. Thus, more levels of information (beyond the "WHAT" and "HOW OFTEN") were needed to answer the "WHY" and "HOW TO FIX" of the underlying root-cause issue.

The mobile app was born.

The mobile dashboard was reborn, transformed into a new mobile workflow more akin to a mobile app.

But it took time for people to understand the marriage between BI dashboards and the mobile wave: the game change Apple introduced with its swipe and pinch-to-zoom gestures, the revolution of the App Stores for the "need access to it now" generation of execs, the capability to write back from mobile devices to any number of source systems, and how, functionally, each of these seemingly unrelated pieces would and could be woven together to create the next generation of mobile apps for Business Intelligence.

But that’s not what I wanted to write about today. It was a dream of the past that has come to fruition. 

Coming into 2013, cloud went from being something that very few understood to another game changer in terms of how CIOs are thinking about application support of the future. And that future is now.

But there are still limitations that we are bound by. Either we have a mobile device or we don't; either it is on 3G, 4G or wifi or it isn't. Add to that our laptops (yes, something I believe will no longer dominate the business world someday). And compound that with other devices like smartphones, eReaders, desktop computers et al.

So, I started thinking about some of the latest research regarding Neural Networks (another topic I have posted about before, on the future of communication via neural networks), recently published on arXiv here (link points to http://arxiv.org/abs/1301.3605).

And my natural "plinko" thought process (before you ask, search for the Price Is Right game and you will understand "plinko thoughts") bounced from Neural Networks to Cloud Networks, and from Cloud Networks to the idea of a Personal Cloud.

A cloud of such personal nature that all of our unique devices are forever connected in our own personal sphere, at all times when on our person. We walk around, and we each have our own personal cloud. Instead of one mass World Wide Web, we have our own personal wide area network and our own personal wide web.

When we interact with other people, those people can choose to share their personal networks with us via Neural Networking or some other sentient process; or, in the example where we bump into a friend and want to share details with them, all of our devices have the capability to interlink with each other via our Personal Clouds.

Devices are always connected to your Personal Cloud, which is authenticated to your person, so that passwords, which are already reaching their shelf life (see: article for more information on this point), are no longer the annoying constraint when we try to seamlessly use our mobile devices on the go. Instead, devices are authenticated to our Personal Cloud, following principles similar to where IAM (Identity and Access Management) is moving in the future. And changes in IAM are not only necessary for this idea to come to fruition but are on the horizon.
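Nothing like a Personal Cloud exists today, so any code can only be a thought experiment. Still, as a toy sketch of the idea that enrolled devices, not passwords, become the trusted credential, here is what person-scoped device enrollment might look like; every class and method name below is invented for illustration.

```python
import secrets
from dataclasses import dataclass, field

@dataclass
class PersonalCloud:
    """Speculative toy model: devices enroll once, then present tokens instead of passwords."""
    owner: str
    device_tokens: dict = field(default_factory=dict)   # device_id -> enrollment token

    def enroll_device(self, device_id: str) -> str:
        token = secrets.token_hex(16)                    # stand-in for real IAM-issued credentials
        self.device_tokens[device_id] = token
        return token

    def authenticate(self, device_id: str, token: str) -> bool:
        return self.device_tokens.get(device_id) == token

cloud = PersonalCloud(owner="laura")
phone_token = cloud.enroll_device("phone")
print(cloud.authenticate("phone", phone_token))   # True: the enrolled device is trusted
print(cloud.authenticate("laptop", "guess"))      # False: unknown device, no password fallback
```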

In fact, Gartner published an article in July 2012, called “Hype Cycle for Identity and Access Management Technologies, 2012” in which Gartner recognized that the growing adoption of mobile devices, cloud computing, social media and big data were converging to help drive significant changes in the identity and access management market.

For background purposes, IAM processes and technologies work across multiple systems to manage the following (a toy sketch follows the list):

■ Multiple digital identities representing individual users, each comprising an identifier (name or key) and a set of data that represent attributes, preferences and traits

■ The relationship of those digital identities to each user’s civil identity

■ How digital user identities communicate or otherwise interact with those systems to handle information or gain knowledge about the information contained in the systems
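As a concrete, if simplified, illustration of the first two bullets, here is a toy sketch of what a digital identity record might hold; the field names are my own, not Gartner's.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class DigitalIdentity:
    """One of possibly many digital identities representing a single user (toy model)."""
    identifier: str                                    # a name or key
    attributes: dict = field(default_factory=dict)     # attributes, preferences and traits
    civil_identity: Optional[str] = None               # the relationship back to the real person

# The same person can hold several digital identities across systems.
work_me = DigitalIdentity("ledell", {"role": "Senior Data Scientist"}, civil_identity="Laura Edell")
blog_me = DigitalIdentity("lauraedell.com", {"topic": "BI futures"}, civil_identity="Laura Edell")
```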

If you extrapolate that third bullet out, and weave in what you might or might not know/understand about Neural Networking or brain-to-brain communication (see the recent Duke findings by Dr. Miguel Nicolelis here) (BTW – the link points to http://www.nicolelislab.net/), one can start to fathom the world of our future. Add in cloud networking, big data, social data and mobility, and perhaps the Personal Cloud concept I extol is not as far-fetched as you initially thought when you started reading this post. Think about it.

My dream, as with my other posts, is to be able to refer back to this entry years from now with a sense of pride and an "I told you so."

Come on – any blogger who makes predictions that come true years later deserves some bragging rights.

Or at least, I think so…