Fishing in data pukes

When a data puke is presented periodically, consumers of the puke learn to “fish” for insights in it. 

I’ve been wondering why data pukes are so common. After all, they demand significant effort from the consumer to understand what is happening and to extract any sort of insight. In contrast, a well-designed dashboard presents the information crisply and concisely.

In practical life, though, most business reports and dashboards I come across can at best be described as data pukes. There is data all over the place, and little to help the consumer find what they’re looking for. In most cases, there is no customisation either.

The thing with data pukes is that data pukes beget data pukes. The first time you come across a report or dashboard that is a data puke, and you have no choice but to consume it, you work hard to get your useful nuggets from it. The next time you come across the same data puke (most likely in the next edition of the report, or the next time you come across the dashboard), it takes less effort for you to get your insight. Soon enough, having been exposed to the data puke multiple times, you become an expert at getting insight out of it.

Your ability to sift through this particular data puke and make sense of it becomes your competitive advantage. And so you demand that the puker continue to puke out the data in the same manner. Even if they were to figure out that they can present it in a better way, you (and people like you) will have none of that, for that will then chip away at your competitive advantage.

And so the data puke continues.


Shapely Gal

Well, you didn’t expect a relationships newsletter named after a game theoretic algorithm, did you? In any case, the wife has started one such, and I strongly urge you to subscribe.

The first edition, published today, is awesome, and I’m not saying this just because it was this kind of awesomeness that took both of us out of the relationship and marriage market. Sample this:

You could just go onto a dating/matrimonial website, set your preferences and end up with 100s of matches. But it mostly only worked for NRI boys in the US trying to find domestic help from India. It wasn’t too common for resident Indians to find each other on these sites in the 90s.

Or:

They’re the types who’ll be too shy to tell you they met their partner on one of these sites as if it was admission of failure. They’ll pretend like they “dated” for a while and got married, but truly, there’s no way to know who they really dated – the spouse, the parents or the random relative who created their profile on these sites.

You can subscribe to the newsletter here. I’m told this will go out once a week.

Moon mode for home

Whenever I’m in a meeting I put my phone on “moon mode”, where all notifications are turned off. If someone has to get in touch with me, they need to call twice in quick succession for my phone to buzz and alert me. The moon mode is automatically switched on every night at 10pm, and notifications are turned off until 6 am.

At night, in fact, another mode called “screen time” is operational, where I’m not allowed to open any apps apart from the ones I’ve explicitly permitted. This includes the clock (for the alarm), Google Maps (in case I’m out), and Spotify and Amazon Music (for my lullabies).

In fact, Screen Time is so strict that any notifications I might have got (overnight mails or messages) are not displayed on the home screen until 6am. This way, in case I wake up in the middle of the night and look at my phone to see the time, I don’t end up seeing something that might cause anxiety.

This is all good in the virtual world, but I need to set up something like this at home. The purposes, again, are similar to those of the moon mode I use on my phone.

Firstly, the wife and I use the home as our offices, and don’t want to be disturbed here. Sundry people, including relatives and friends, assume that since we’re at home all the time we are unemployed and they can drop in any time. And when we’re working, we want the “home moon mode” on so that the doorbell doesn’t ring.

Secondly, in our two years in London, we became enamoured of the Western practice of putting kids to bed early, and despite massive difficulties, we’ve been attempting to do the same here. Last night, for instance, the daughter was asleep by 7:20.

And it is critical that (especially) while we are putting her to bed, and when she is asleep, the doorbell doesn’t ring. And since 7pm is an unusual time for kids to be put to bed in India, the doorbell continues to buzz. And of course we don’t want the doorbell to buzz after we’ve gone to bed either.

In short, we need a “moon mode” for home. The simplest solution would be to get a doorbell that can be turned on and off at will (right now it’s a bit high up and out of reach, but we should be able to manage that). That works for the times when we’re in meetings, working at home or otherwise busy, but it might be a pain to remember to turn it off every night (and back on in the morning).

So I’m wondering if we should get a doorbell that is connected to an app, where we can set times of day when it is automatically on and off (with the ability to override).

Then again I don’t want to give my data to some random company (and I’m a bit spooked by hacking of random internet-connected devices), so I might end up going for a simpler solution – an “offline device” which I can hopefully program to go on and off at certain times, and maybe change tune for the night!

Now to find such a device.

Freebies and Misers

Recently the wife and I were having a bitching session about some of our relatives and friends: how, despite being rather wealthy, they’re quite miserly, both in spending on themselves and in spending on others.

While we were wondering why people with so much money are so stingy, the wife noticed a pattern – they are all people who are used to getting freebies in their professional lives.

There are consultants on expense accounts whose every expense on tour is paid for by their clients. There are doctors who are routinely provided “expense accounts” by medical representatives. There are people who work for the government who get a lot of “perks” in addition to the (rather meagre) salaries they make. There are journalists, who when on PR jaunts, are again used to living on an expense account.

The point with all of them is that they are so used to getting others to pay for their expenses that they are simply not used to spending themselves. And so when it is time for them to spend, they spend like they used to in the time before any of these expense account taps opened up for them.

For most of the people referred to above, this means the time when they were either students or entry level employees – times when they didn’t have much money at all. And they end up living the rest of their non-professional (non-expense-account) lives spending like they used to as students or entry level employees.

Back when I was a banker making lots of money, I remember having this conversation with a then medical student who was excited that once she became a “big doctor” she would have medical representatives at her beck and call, who would fund her life. I had replied that I would rather make all my money in cash and have the discretion on what I wanted to spend on, rather than have someone else make the decisions on what I should be splurging on.

I guess there are other benefits as well to spending your own money, rather than living on an expense account.

PS: I just remembered that I haven’t “filed expenses” to my client for a business trip I took a couple of weeks ago.

More on statistics and machine learning

I’m thinking of a client problem right now, and I thought that something that we need to predict can be modelled as a function of a few other things that we will know.

Initially I was thinking about it from the machine learning perspective, and my thought process went “this can be modelled as a function of X, Y and Z. Once this is modelled, then we can use X, Y and Z to predict this going forward”.

And then a minute later I context-switched into the statistical way of thinking. Now my thinking went “I think this can be modelled as a function of X, Y and Z. Let me build a quick model to check the goodness of fit, and see whether a signal actually exists”.

Now this might reflect my own biases, and my own processes for learning to do statistics and machine learning, but one important difference I find is that in statistics you are concerned about the goodness of fit, and whether there is a “signal” at all.

While in machine learning we also look at predictive ability (area under the ROC curve and all that), there is a delay between the time we build the model and the time we look at the goodness of fit. What this means is that we can sometimes get a bit too certain about the models we want to build without first asking whether they make sense and whether there is a signal at all.

For example, in the machine learning world, the in-sample R-square of a regression doesn’t really matter – the only thing that matters is how well you can predict out of sample. So while you’re building the regression (machine learning) model, you don’t have immediate feedback on what to include, what to exclude and whether there is a signal.
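To make the contrast concrete, here’s a minimal sketch with simulated data (X, Y and Z stand in for whatever explanatory variables you have; this is not from any client problem). The same fitted regression is scored two ways: the “statistics” way (goodness of fit on the data you fitted on) and the “machine learning” way (predictive ability on held-out data).

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: the target genuinely depends on three variables plus noise,
# so a signal exists by construction.
n = 200
X = rng.normal(size=(n, 3))                          # stand-ins for X, Y, Z
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(size=n)

# Split into a training set and a held-out test set.
X_train, X_test = X[:150], X[150:]
y_train, y_test = y[:150], y[150:]

# Fit ordinary least squares (with an intercept) on the training data only.
Xd = np.column_stack([np.ones(len(X_train)), X_train])
beta, *_ = np.linalg.lstsq(Xd, y_train, rcond=None)

def r_squared(X_, y_, beta):
    """R-square of the fitted line on a given dataset."""
    pred = np.column_stack([np.ones(len(X_)), X_]) @ beta
    ss_res = np.sum((y_ - pred) ** 2)
    ss_tot = np.sum((y_ - y_.mean()) ** 2)
    return 1 - ss_res / ss_tot

# The "statistics" check: goodness of fit on the data you fitted on.
print(f"in-sample R^2:     {r_squared(X_train, y_train, beta):.3f}")
# The "machine learning" check: predictive ability on unseen data.
print(f"out-of-sample R^2: {r_squared(X_test, y_test, beta):.3f}")
```

If there is a real signal, both numbers are high; if you regress on pure noise, the in-sample R-square stays modestly positive (OLS always fits something) while the out-of-sample one collapses towards zero or below – which is exactly why the two disciplines end up emphasising different checks.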

I must remind you that machine learning methods are typically used when we are dealing with really high dimensional data, and where the signal usually exists in the interplay between explanatory variables rather than in a single explanatory variable. Statistics, on the other hand, is used more for low dimensional problems where each variable has reasonable predictive power by itself.

It is possibly a quirk of how the two disciplines are practiced that statistics people are inherently more sceptical about the existence of signal, and machine learning guys are more certain that their model makes sense.

What do you think?

Studs and fighters: Origin

As far as this blog is concerned, the concept of studs and fighters began sometime in 2007, when I wrote the canonical blog post on the topic. Since then the topic has been much used and abused.

Recently, though, I remembered when I had first come across the concept of studs and fighters. This goes way back to 1999, and has its origins in a conversation with two people whom I consider among the studdest people I’ve ever met (they’re both now professors at highly reputed universities).

We were on a day-long train journey, and were discussing people we had spent a considerable amount of time with over the previous one month. It was a general gossip session, the sort that was common to train journeys in the days before smartphones made people insular.

While discussing one guy we had met, one of us (it wasn’t me for sure; it was one of the other two, but I now can’t recall which) said “well, he isn’t particularly clever, but he is a very hard worker for sure”.

And so over time this distinction got institutionalised, first in my head and then in the heads of all my readers. There were two ways to be good at something – by either being clever or by being a very hard worker.

Thinking about it now, it seems rather inevitable that the concept that would become studs and fighters came about in the middle of a conversation among studs.

News

I wake up early on weekdays nowadays, so I go through the first two hours of the day without really knowing what is happening in the world. As you might know, I’m on a social media break, so that source of news is cut off. And it’s only around 7 am that a copy of the Business Standard gets delivered to my door.

Until last month, a copy of the Deccan Herald would arrive at home as well, but I stopped it after I found it to be largely useless. A lot of stories in that newspaper were written as they might have been 20 or 30 years ago. There was little distinction between reporting, analysis and opinion. A lot of news simply couldn’t be consumed without the accompanying (and sometimes patronising) opinion.

The Business Standard, which I started reading in 2005, is still a very good paper. The editorials continue to be first-rate (though their quality had dipped in the 2011-14 period). The analysis pieces and columns cover a variety of topics that simply don’t make it to social media (since they aren’t really “sensational”). And the newspaper is “crisp” and quickly tells you what’s going on in India.

For two years, when I lived in London, I lived without a daily newspaper, and it was a struggle. Online newspapers have simply not been able to provide the same kind of product as offline newspapers. And the reason is that online newspapers are “flat” – all the contextualising and prioritising that a dead-tree paper can do is completely absent in the online version.

In a dead-tree newspaper, you know how important a piece of news is based on the page it appears on, the size of the headline, the size of the column and so on. Based on where it appears, you know whether it is news, analysis or opinion. If it is opinion, you can easily see who has written it before you “click through” (start reading it). And you can easily see how big a piece is (and how much of your time it will take) before deciding to invest time in it.

All this is absent from an online newspaper. Check out, for example, the homepage of the Business Standard, which I so effusively praised earlier in this post.

It is impossible to know what the important stuff is here. If I have only five minutes to read, I don’t know what to focus on. I don’t know which of these pieces is opinion and which is news. Before I click through, I don’t know how big a piece is, who has written it, or whether it has been syndicated.

Unless the link has come from a qualified source (such as Twitter) I don’t know much about it, and so don’t know how to consume it. This might explain to you why a lot of online news sources are losing revenues to the likes of Google or Facebook – the latter do the important job of putting the news in context!

Finally, I’m glad I now consume news only once a day (from the physical paper). Sometimes, what is news intra-day will have ceased to be news by nightfall. So when you consume news at a reasonable interval (such as daily), what comes to you is “qualified”, real stuff – a piece of news must have been important enough to survive the day and make it into the next day’s newspaper. And once a day is also a reasonable interval at which to get to know what is happening in the world.

Data, football and astrology

Jonathan Wilson has an amusing article on data and football, and how many data-oriented managers in football have also been incredibly superstitious.

This is in response to BT Sport (one of the UK broadcasters of the Premier League) announcing its “Unscripted” promotion, which brings together “some of the world’s foremost experts in both sports and artificial intelligence to produce a groundbreaking prophecy of the forthcoming season”.

Wilson writes:

I was reminded also of the 1982 film adaptation of Agatha Christie’s 1939 novel Murder is Easy in which a computer scientist played by Bill Bixby enters the details of the case into a programme he has coded to give the name of the murderer. As it turns out, the programmer knows this is nonsense and is merely trying to gauge the reaction of the heroine, played by Lesley-Anne Down, when her name flashes on the screen.

But this, of course, is not what data-based analysis is for. Its predictive element deals in probability not prophecy. It is not possessed of some oracular genius. (That said, it is an intriguing metaphysical question: what if you had all the data, not just ability and fitness, but every detail of players’ diet, relationships and mental state, the angle of blades of grass on the pitch, an assessment of how the breathing of fans affected air flow in the stadium … would the game’s course then be inevitable?)

This reminded me of my own piece that I wrote last year about how data science “is simply the new astrology”.

Gaming and social media

Now that I’m off social media I’m wondering if it’s a good idea to replace it with gaming.

Basically I need a fix. I have this need to be distracted/”accelerated” all the time. And I need something that I can go to, get distracted for a little bit and then come back to work.

Sometimes a text message conversation in the middle of work does the trick, but not all conversations are good for this (some people are just too quick in replying and end up stressing me). Also there’s the issue of finding someone to talk to when you need to.

Over the last decade or so I’ve turned to social media for this. Twitter, with its stream of tweets, provides good stimulation. The problem is that it wastes a lot of mental bandwidth through the massive context switching involved. Yes, it provides distraction, but it is also tiring.

So I’m wondering if I should try out gaming. I know it is addictive, which is both good and bad (addictions are good when you are trying to take your mind off something). And gaming sure provides the “kick” though I’m not sure if it will be as taxing on the mind as social media. The key, however, is to find a game that can be played in short bursts whenever I need that “kick”.

What do you think? Is there a game that I can use for my distractions, or do you think starting gaming might unleash a different beast that might be hard to tame?

Television and interior design

One of the most under-rated developments in the world of architecture and interior design has been the rise of the flat-screen television. Its earlier avatar, the cathode ray tube (CRT) version, was big and bulky, and needed special arrangements to house. One solution was to keep it in a corner. Another was to have purpose-built deep “TV cabinets” into which these big screens would go.

In the house that I grew up in, there was a purpose-built corner to keep our televisions. Later on in life, we got a television cabinet to put in that place, that housed the television, music system, VCR and a host of other things.

For the last decade, which has largely coincided with the time when flat-screen LCD/LED TVs have replaced their CRT variations, I’ve seen various tenants struggle to find a good spot for the TVs. For the corner is too inelegant for the flat screen television – it needs to be placed flat against the middle of a large wall.

When the flat screen TV replaced the CRT TV, out went the bulky “TV cabinets” and in came the “console” – a short table on which you kept the TV, and below which you kept the accompanying accessories such as the “set top box” and DVD player. We had even got a purpose-built TV console with a drawer to store DVDs in.

Four years later, we’d dispensed with our DVD player (at a time when my wife’s job involved selling DVDs and CDs, we had no device at home that could play either format!). And now we have “cut the cord”. After we returned to India earlier this year, we decided not to get cable TV, relying instead on streaming through our Fire stick.

And this heralds the next phase in which television drives interior design.

In the early days of flat screen TVs, it became common for people to “wall mount” them. This was usually a space-saving device, though people still needed a sort of console to store input devices such as set top boxes and DVD players.

Now, with the cord cut and DVD players no longer common, wall mounting doesn’t make sense at all. With WiFi-based streaming devices, the TV is truly mobile.

In the last couple of months, the TV has nominally resided in our living room, but we’ve frequently taken it to whichever room we wanted to watch it in. All we need to move the TV is a table to keep it on, and a pair of plug points for the TV and the Fire stick.

In our latest home reorganisation we’ve even dispensed with a permanent home for the TV in the living room, thus radically altering its design and creating more space (the default location of the TV now is in the study). The TV console doesn’t make any sense, and has been temporarily converted into a shoe rack. And the TV moves from room to room (it’s not that heavy, either), depending on where we want to watch it.

When the CRT TV gave way to the flat screen, architects responded by creating spaces where TVs could be put in the middle of a long wall, either mounted on the wall or kept on a console. That the TV’s position in the house changed meant that the overall architecture of houses changed as well.

Now it will be interesting to see what large-scale architectural changes get driven by cord-cutting and the realisation that the TV is essentially a mobile device.