Creating my first Tableau web data connector: part 3

At last, the final part of the trials and tribulations of creating my first Tableau Web Data Connector… Part 1 went through the prerequisites and building what I might generously term my “user interface”. Part 2 was a struggle against the forces of web security. And in this part, we battle against the data itself, until the error-message dragon is truly slain.

So, in the last part, we ended up with a connector that could ask for the username of a BoardGameGeek user, connect to a CORS-enabled version of the API, and complain that whatever objects it found “are not valid as a React child”.

That makes sense, as I didn’t yet tell it what sort of data objects to expect.

As I’m following the Tableau web data connector tutorial, which accesses data from URLs like this one, I figured I’d look at precisely what data that URL returns, compare it to the tutorial code, and then, once I understood the relationship between the code and the data from the Yahoo URL, adapt the code to fit the format my chosen data arrives in.

Here’s the data one gets if you point your web browser to the URL that the tutorial is teaching you how to build a WDC from:

[Screenshot: the raw JSON returned by the tutorial’s Yahoo stock quote URL]

OK, stock-pickers will immediately recognise that it’s stock quote data, with dates, highs, lows, symbols and all the other lovely gubbins that come along with basic stock charts. It’s in JSON format, which is human-readable, but only in small doses.

So I need to understand the relationship between the above data, and this piece of code:

if (data.query.results) {
    var quotes = data.query.results.quote;
    var ii;
    for (ii = 0; ii < quotes.length; ++ii) {
        var entry = {'Ticker': quotes[ii].Symbol,
                     'Day': quotes[ii].Date,
                     'Close': quotes[ii].Close};
        dataToReturn.push(entry);
    }
}

I can kind of see it by eye – there are obviously JSON entries for Symbol, Date, Close, query and so on. But can’t we make it easier?

Yes we can, because there’s such a thing as a web JSON viewer. Paste all that stock-related text into something like http://www.jsoneditoronline.org/ and you get a nice hierarchical visualisation of the JSON structure, like this:

[Screenshot: the same JSON displayed as a collapsible tree in the online editor]
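In outline, the returned structure is something like this (a hand-typed abbreviation for illustration – the field values here are invented):

    {
      "query": {
        "results": {
          "quote": [
            {"Symbol": "AAPL", "Date": "2016-01-04", "Close": "105.35"},
            {"Symbol": "AAPL", "Date": "2016-01-05", "Close": "102.71"}
          ]
        }
      }
    }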

So if we assume that “data” is referring to the overall set of data itself, and use dot notation to traverse the hierarchy, we can see a line in the code that says:

var quotes = data.query.results.quote;

That would seem to fit into the JSON structure above, where below the “object” level you have a level called query, a sublevel called results, and a sublevel called quote.

[Screenshot: the query → results → quote levels highlighted in the JSON tree]

The variable “quotes” that the code creates is therefore basically referring to everything at or below the “quote” level of the JSON hierarchy.

Then you get a bunch of records which are the data of interest itself. These are numbered 0 for the first datapoint, 1 for the next, and so on.

If you know a bit of basic programming, you might note that the WDC code has a “for loop” with a counter variable called “ii” that starts at 0 and increments by one each time the loop runs. That fits nicely with the idea of iterating through datapoint 0, 1, …n until it gets to the end (which is effectively the length of the quotes dataset, i.e. quotes.length).

Each one of these JSON records then has a few attributes – and they include the ones mentioned in the code as being part of whichever “quotes” subrecord we’re on: “Symbol”, “Date” and “Close”.

var entry = {'Ticker': quotes[ii].Symbol,
             'Day': quotes[ii].Date,
             'Close': quotes[ii].Close};

[Screenshot: the Symbol, Date and Close attributes of a single quote record]

Awesome. So where do “Ticker”, “Day” and “Close” on the left-hand side of those code lines come from?

Well, that’s up to us. Remember the “add code for defining columns” part of the tutorial?

myConnector.getColumnHeaders = function() {
    var fieldNames = ['Ticker', 'Day', 'Close'];
    var fieldTypes = ['string', 'date', 'float'];
    tableau.headersCallback(fieldNames, fieldTypes);
}

There they’ve defined the field names they want Tableau to display (which can be totally different from the JSON names we saw above in the datasource) and the field types.

Field types can be any from this list:

  • bool
  • date
  • datetime
  • float
  • int
  • string

So, let’s look at what’s returned when I make the call to my Boardgames datasource.

[Screenshot: the raw response from the BoardGameGeek collection API]

OK, I can see some fields in there, but it’s kind of messy. So I pasted it back into the online JSON editor.

Here’s what the relevant part looks like when formatted nicely:

[Screenshot: the BoardGameGeek response formatted in the online JSON editor]

So, the interesting information there is probably the boardgame’s name, the year it was published, URLs to its image and thumbnail, the various statuses (own, want, etc.) and the number of times it’s been played (numplays).

I therefore edited the “getColumnHeaders” code snippet to reflect the fields I wanted, and the type of data that would feature in them (the two arrays are parallel, so the first name pairs with the first type, and so on).

myConnector.getColumnHeaders = function() {
    var fieldNames = ['BoardgameName', 'YearPublished', 'ImageURL',
                      'ImageThumbnailURL', 'Plays', 'Owned', 'PreviouslyOwned',
                      'ForTrade', 'Want', 'WantToPlay', 'WantToBuy',
                      'Wishlist', 'Preordered', 'LastModified'];
    var fieldTypes = ['string', 'int', 'string', 'string', 'int', 'int',
                      'int', 'int', 'int', 'int', 'int', 'int', 'int',
                      'datetime'];
    tableau.headersCallback(fieldNames, fieldTypes);
}

Now I’d defined the fields, I could go back to the results-retrieving section of the code (if (data.query.results)…) and, knowing the structure of the JSON generated by my API, parse out the data I wanted and assign it to the fields defined above.

I decided to call the collection of data I was building “games” rather than “quotes”, because that’s what it is. I next noted that each individual “game” within the JSON is listed in a hierarchy below the “item” entry which itself is below “items”.

[Screenshot: the items → item hierarchy in the JSON tree, with “Amyitis” as the first boardgame]

(“Amyitis” is the name of a boardgame, rather than an allergy to people called Amy, believe it or not).
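To give a flavour of what the converted output looks like, here’s a hand-abbreviated sketch of a single entry (the values are illustrative, and I’ve trimmed fields the connector doesn’t use):

    {
      "items": {
        "item": [
          {
            "name": [ { "_": "Amyitis", "$": { "sortindex": "1" } } ],
            "yearpublished": [ "2007" ],
            "image": [ "//cf.geekdo-images.com/images/pic12345.jpg" ],
            "thumbnail": [ "//cf.geekdo-images.com/images/pic12345_t.jpg" ],
            "numplays": [ "1" ],
            "status": [ { "$": { "own": "1", "prevowned": "0", "fortrade": "0",
                                 "want": "0", "wanttoplay": "0", "wanttobuy": "0",
                                 "wishlist": "0", "preordered": "0",
                                 "lastmodified": "2016-01-01 00:00:00" } } ]
          }
        ]
      }
    }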

So, I assigned all the items.item data to “games”:

if (data.items.item) {
    var games = data.items.item;

And copied the tutorial to loop through all the “items.item” entries, each of which is a boardgame, until we’d reached the end, i.e. when the number of times we’ve looped equals the length of the data table.

    var ii;
    for (ii = 0; ii < games.length; ++ii) {

Finally, it’s time to assign the relevant bit of data returned from the API to the variables I’d defined above.

[Screenshot: the JSON attributes for a single boardgame record]

At first I got a little confused, because the JSON output had a bunch of _ and $ entries that didn’t appear anywhere in the tutorial dataset. But it turns out that’s nothing to worry about: they’re artefacts of converting XML to JSON (typically the $ entry holds an element’s attributes and the _ entry its text content). Just treat them as though they were any other text.

        var entry = {'BoardgameName': games[ii].name[0]._,
                     'YearPublished': games[ii].yearpublished[0],
                     'ImageURL': 'https:' + games[ii].image[0],
                     'ImageThumbnailURL': 'https:' + games[ii].thumbnail[0],
                     'Plays': games[ii].numplays[0],
                     'Owned': games[ii].status[0].$.own,
                     'PreviouslyOwned': games[ii].status[0].$.prevowned,
                     'ForTrade': games[ii].status[0].$.fortrade,
                     'Want': games[ii].status[0].$.want,
                     'WantToPlay': games[ii].status[0].$.wanttoplay,
                     'WantToBuy': games[ii].status[0].$.wanttobuy,
                     'Wishlist': games[ii].status[0].$.wishlist,
                     'Preordered': games[ii].status[0].$.preordered,
                     'LastModified': games[ii].status[0].$.lastmodified
                    };
        dataToReturn.push(entry);
    }
}

In the above, you can think of games[ii] as being the individual boardgame record. The “for loop” we defined above increments the ‘ii’ variable each time around, so the code accesses games[0], games[1] and so on – which translates into data.items.item[0] etc., if you remember how we defined the games variable above.
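To make that concrete, here are two expressions that refer to the same record (just an illustration, not extra connector code):

    // Both of these point at the first boardgame in the collection
    var firstGame = games[0];
    var sameGame = data.items.item[0];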

Then, to find the boardgame’s name, we need to traverse into the “name” child of the boardgame itself, then look for the first entry below that (always the first one here, so we can refer to it as entry [0]), and look for the field that is shown as an underscore, _.

[Screenshot: the name → 0 → _ path in the JSON tree]


Hence:

'BoardgameName': games[ii].name[0]._,

Rinse and repeat this for each element of interest, and you’ve collected your dataset ready for Tableau to use!

Of course, the world is not perfect and, in reality, I did not manage to do the above without making the odd mistake here and there. I sometimes got blank fields when I knew they shouldn’t be blank, or fields with the wrong data in them. And as I was just using Notepad and the online simulator, I wasn’t really getting many useful error messages.

Lesson learned: you can use the code

tableau.log("YOUR MESSAGE HERE");

to display the message you write in the Google Chrome developer console. You might remember from part 2 that you can bring that up by pressing Ctrl+Shift+I in Google Chrome and selecting the “Console” tab.

Why is that useful? Well, firstly, if it displays your message then you know that piece of code ran, so you can check you’re not skipping over any important section. And secondly, you can append text stored in variables to it. So, for instance, I could write:

tableau.log("the name of the game is " + games[ii].name[0]._);

And, if my code is correct, it should print out the name of the boardgame as it runs.

If I messed up, perhaps by using the wrong index or forgetting the underscore, it will output something different, and perhaps also an error message, guiding me towards the source of the problem.
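With hindsight, another trick along the same lines – and I should stress this is my own retrofitted sketch, not something from the tutorial – is to guard each field access rather than assume every level of the structure is present, so a missing field produces a log message instead of a silent blank:

    // Hypothetical helper (my own naming): returns the first element of an
    // array-wrapped field, or null if the field is missing or empty.
    function firstOrNull(field) {
        return (field && field.length > 0) ? field[0] : null;
    }

    var nameElement = firstOrNull(games[ii].name);
    if (nameElement === null) {
        tableau.log("No name found for game at index " + ii);
    } else {
        tableau.log("The name of the game is " + nameElement._);
    }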

That, as they say, is basically that! I ran it through the hosted Tableau Web Data Connector simulator again, this time ticking the box so it would “automatically continue to data gather phase”. And, lo and behold, below you can see that it has interpreted the information about that famous Amyitis game into a nice tabular format.

[Screenshot: the simulator showing the Amyitis collection data in tabular form]

Once you’ve got to that stage, you’ve already won. Just load up Tableau Desktop for real, select to connect to a web data connector, and up should pop your interface and, later, your data.


Creating my first Tableau web data connector – part 2

Now for the next part of the creating-a-web-data-connector journey: previously, I had got all the software I needed to create a web data connector installed (well, to be fair, Notepad comes with Windows, so that wasn’t hard) and designed my user interface.

Now it was time to move on to Tableau tutorial section 5 – using the information retrieved from the user of my UI in order to have Tableau connect to the BoardGameGeek API and retrieve the relevant data for analysis.

I read through the Tableau WDC tutorial and, using some previous programming-other-stuff knowledge, realised that it was a mix of explaining the fundamentals of connecting to a generic web data connector and explaining some very specific details of the example Yahoo finance connector it walks through.

The first thing to do then was to determine which bits were essential for my simple attempt and which were only there to facilitate the specific Yahoo API.

  • The “code for defining columns” section is obviously important. I need columns to put my data in.
  • The Yahoo Query Language (YQL) parts, less so – my API is nothing to do with Yahoo.
  • The helper functions (buildUri, getFormattedDate, makeTwoDigits)? No – these are just functions to help construct the YQL query needed to access their example API. It’s great to know that helper functions like this are possible if needed, but chances are, if I did need some, they’d not be the same ones unless my API was very similar to the Yahoo example.
  • The “code to get stock data” – well, I don’t want stock data, but I do want some sort of data, so some of it was obviously going to be relevant. I pasted it into my file as-is, noting from the code that parts would clearly need replacing, given that I don’t want boring finance data when exciting boardgames data is available instead.

It seemed obvious that providing the weblink to my API of choice – http://boardgamegeek.com/xmlapi2/collection?username=[whatever] – was going to be key. So I replaced their connectionUri code with a simple text variable of that, using the user-supplied BGG username I had got from the previous part.

connectionUri = 'http://boardgamegeek.com/xmlapi2/collection?username=' + bggUsername


(a URI is a Uniform Resource Identifier)
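One refinement worth a mention, though I didn’t make it at the time: if a username happens to contain spaces or other special characters, splicing it raw into the URL could break the query string. JavaScript’s built-in encodeURIComponent function does the necessary escaping:

    // Escapes spaces, '&', '?' and the like so the username can't
    // mangle the query string
    connectionUri = 'http://boardgamegeek.com/xmlapi2/collection?username=' +
                    encodeURIComponent(bggUsername);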

Of course this wasn’t going to be enough to make it work – I had not yet told Tableau what data I wanted it to receive and how to format it – but I figured I’d run it through the simulator and see what happened. Nothing especially good did. But I did learn about “CORS”.

Now, again, no particular error messages tend to pop up by default, even when things go fatally wrong in the simulator. It might look like it’s hung, it might go blank or similar, but there’s no friendly message.

Lesson learned: if you are working with the super-basic tools I had at hand, what you need is the developer tools built into Chrome, which you can bring up by pressing Ctrl+Shift+I. Switch to the “Console” tab, and you might see an error message there to help you out, if you’re lucky.

Here’s what I saw:

[Screenshot: Chrome console showing an Access-Control-Allow-Origin error]

Now, it’s been a long time since I did any serious web development, and I had never suffered an Access-Control-Allow-Origin problem before. A swift bit of Googling led me to this nice explanation, where I learnt that it’s a security feature “implemented in browsers to restrict interaction between documents (or scripts) that have different origins”. That makes sense; I don’t own either the BoardGameGeek site or GitHub, I’m sad to say.

Of the 3 solutions that Jvaneyck suggests, it seemed like finding a JSONP source for my API instead of the XML one would be simplest, given that I have no control over either server and no intention of building any infrastructure. So, can I get the BoardGameGeek API in JSONP format? Well, another quick Google™ led me to a post on the BoardGameGeek forum, where a kind user named strathmeyer had offered up an “Ajax-ready BGG API for any web programmers out there”.

The best thing is that I’ve enabled CORS headers, so you can use the API straight from the web browser

Yes, here indeed that was the best thing! Access-Control-Allow-Origin issue solved, and all thanks to the work of someone else 🙂
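(For the curious: “enabling CORS headers” just means the server includes an extra header in its HTTP responses telling browsers which origins are allowed to read them. Something along these lines – a sketch of the mechanism rather than the literal response from that API:

    HTTP/1.1 200 OK
    Content-Type: application/json
    Access-Control-Allow-Origin: *

The * means “any origin may use this response”, which is exactly what a little connector hosted elsewhere needs.)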

Lesson learned: I later found that Tableau 1) has a whole page dedicated to this exact problem, and 2) provides an example in their SDK of various methods to get around this issue when connecting to XML. So next time, I should probably RT original M before going Google crazy.

But for now, all I needed to do was change the URI I pointed to from my web data connector to:

connectionUri = 'http://bgg-api.herokuapp.com/api/v1/collection?username=' + bggUsername

I ran that through the simulator once more, of course not expecting it to work before I had defined what data I was actually after – but it did eliminate the previous error message, so I knew things were moving on nicely.

So when it ran through, I hit Ctrl+Shift+I back in Chrome to bring up the inspector, and saw this message.

[Screenshot: Chrome console showing a “variable is not defined” error]

My variable is not defined?! Where did it go? Oops, my fault: I’d embedded a previous function inside this one. I needed to be more careful with my bracket-pasting finger. But it was simplicity itself to move the code back to where it should be, and re-simulate.


[Screenshot: Chrome console showing an “uncaught invariant violation” error]

OK, “uncaught invariant violation” is perhaps a little cryptic, but it sounds like it’s talking about having found something it didn’t expect. That could be my data (spoiler: it was). So now I needed to tell it what to expect and what to do with it.

Tune back in, same place, different time, for the exciting conclusion!


The Tableau #MakeoverMonday doesn’t need to be complicated

For a while, a couple of key members of the insatiably effervescent Tableau community, Andy Cotgreave and Andy Kriebel, have been running a “Makeover Monday” activity. Read more and get involved here – but a simplistic summary would be that they distribute a nicely processed dataset on a topic of the day, relating to someone else’s existing visualisation, and the rest of us Tableau fans can have a go at making our own chart, dashboard or similar to share back with the community, so we can inspire and learn from each other.

It’s a great idea, and generates a whole bunch of interesting entries each week. But Andy K noticed that each Monday’s dataset was getting way more downloads than the number of charts later uploaded, and opened a discussion as to why.

There are of course many possible reasons, but one that came through strongly was that, whilst people were interested in the principle, they didn’t think they had the time to produce something comparable to some of the masterpieces that frequent the submissions. That’s a sentiment I wholeheartedly agree with and, in retrospect – albeit subconsciously – it’s why I never gave it a go myself.

Chris Love, someone who likely interacts with far more Tableau users than most of us do, makes the same point in his post on the benefits of Keeping It Simple Stupid. I believe it was written before the current MakeoverMonday discussions began in earnest, but was certainly very prescient in its applications to this question.

Despite this awesome community many new users I speak to are often put off sharing their work because of the high level of vizzes out there. They worry their work simply isn’t up to scratch because it doesn’t offer the same level of complexity.


To be clear, the original Makeover Monday guidelines did say that it was quite proper to just spend an hour fiddling around with the data. But firstly, after a hard day battling the dark forces of poor data quality and data-free decisions at work, it can be a struggle to keep on trucking for another hour, however fun it would be in other contexts.

And that’s if you can persuade your family to let you keep tapping away for another hour doing what, from the outside, looks rather like you forgot to finish work. In fact, a lot of the worship I have for the Tableau Zen Masters comes from how they fit what they do into their lives.

But, beyond that, an hour is not going to be enough to “compete” with the best of what you see other people doing in terms of presentation quality.

I like to think I’m quite adept with Tableau (hey, I have a qualification and everything :-)), but I doubt I could create and validate something like this beauty using an unfamiliar dataset on an unfamiliar topic in under an hour.


It’s beautiful; the authors of this and many other Monday Makeovers clearly have an immense amount of skill and vision. It is fascinating to see both the design ideas and technical implementation required to coerce Tableau into doing certain non-native things. I love seeing this stuff, and very much hope it continues.

But if one is not prepared to commit the sort of time needed to do that regularly for this activity, then one has to try and get over the psychological difficulty of sharing a piece of work which one perceives is likely to be thought of as “worse” than what’s already there. This is through no fault of the MakeoverMonday chiefs, who make it very clear that producing a NYT infographic each week is not the aim – but I can certainly see why it deters more of the data-downloaders from uploading their work. And it’s great to see that topic being directly addressed.

After all, for those of us who use Tableau for the day-to-day joys of business, we probably don’t rush off and produce something like this wonderful piece every time some product owner comes along to ask us an “urgent” question.

Instead, we spend a few minutes making a line chart that gives them some insight into the answer to their question. We upload an interactive bar chart, with default Tableau colours and fonts, to let them explore a bit deeper, and so on. We sit in a meeting and dynamically provide an answer enabling live decision-making that, before we had tools like this, would have had to wait a couple of weeks for a csv report. Real value is generated, and people are sometimes even impressed, despite the fact that we didn’t include hand-drawn iconography, gradient-filled with the company colours.

Something like this perhaps:

Yes, it’s “simple”, and it’s unlikely to go Tableau-viral, but it makes a key story held within that data very clear to see. And it’s far more typical of the day-to-day Tableau use I see in the workplace.

For the average business question, we probably do not spend a few hours researching and designing a beautiful colour scheme in order to perform the underlying maths needed to make a dashboard combining a hexmap, a Sankey chart and a network graph in a tool that is not primarily designed to do any of those things directly.

No-one doubts that you can cajole Tableau into such artistry, that there is sometimes real value obtainable by doing so, or that those who carry it out may be creative geniuses – but unless they have a day job very different from that of mine and my colleagues, I suspect it’s not their day-to-day either. It’s probably more an expression of their talent and passion for the Tableau product.

Pragmatically, if I need to make, for instance, a quick network chart for “business” then, all other things being equal, I’m afraid I’m more likely to get out a tool that’s designed to do that than to take a bit more time to work out how to implement it in Tableau, no matter how much I love it (by the way, Gephi is my tool of choice for that – it is nowhere near as user-friendly as Tableau, but it is specifically designed for that sort of graph visualisation; recent versions of Alteryx can also do the basics). Honestly, it’s rare for me that these more unusual charts need to be part of a standard dashboard; our organisation is simply not at a level of viz-maturity where these diagrams are the most useful for most of the intended audience – if indeed they are for many organisations.

And if you’re a professional whose job is creating awesome newspaper-style infographics, then I suspect that, more often than not, you’re not using Tableau to produce the final output either. That’s not its key strength in my view; it’s not how they sell it – although they are justly proud of the design-thought that does go into the software in general. But if the paper-WSJ is your target audience, you might be better off using a more custom design-focused tool, like Adobe Illustrator (and Coursera will teach you that specific use-case, if you’re interested).

I hope nothing here will cause offence. I do understand the excitement and admire anyone’s efforts to push the boundaries of the tool – I have done so myself, spending way more time than is strictly speaking necessary in terms of a theoretical metric of “insights generated per hour” to make something that looks cool, whether in or out of work. For a certain kind of person it’s fun, it is a nice challenge, it’s a change from a blue line on top of an orange line, and sometimes it might even produce a revelation that really does change the world in some way.

This work surely needs to be done; adherents to (a bastardised version of) Thomas Kuhn’s theory of scientific revolutions might even claim this “pushing to the limits” as one of the ways of engendering the mini-crisis necessary to drive forward real progress in the field. I’m sure some of the valuable Tableau “ideas” that feed the development of the software have come from people pushing the envelope, finding value, and realising there should be an easier way to generate it.

There’s also the issue of engagement: depending on your aim, optimising your work for being shared worldwide may be more important to you than optimising it for efficiency, or even clarity and accuracy. This may sound like heresy, and it may even touch on ethical issues, but I suspect a survey of the most well-known visualisations outside of the data community would reveal a discontinuity with the ideals of Stephen Few et al!

But it may also be intimidating to the weary data voyager when deciding whether to participate in these sort of Tableau community activities if it seems like everyone else produces Da Vinci masterpieces on demand.

Now, I can’t prove this with data right now, sorry, but I just think it cannot be the case. You may see a lot of fancy and amazing things on the internet – but that’s the nature of how stuff gets shared around; it’s a key component of virality. If you create a default line chart, it may actually be the best answer to a given question, but outside a small community actively interested in the subject domain at hand, it’s not necessarily going to get much notice. I mean, you could probably find someone who made a Very Good Decision based even on those ghastly Excel 2003 default charts with the horrendous grey background, if you try hard enough.

[Image: a default Excel 2003 chart, grey background and all]

Never forget…


So, anyway, time to put my money where my mouth is and actually participate in MakeoverMonday. I don’t need to spend even an hour making something if I don’t want to, right? (After all, I’ve used up all my time writing the above!)

Tableau is sold with emphasis on its speed of data sense-making, claiming to enable producing something reasonably intelligible 10-100x faster than other tools. If we buy into that hype, then spending 10 minutes of Tableau time (necessitating making one less cup of tea, perhaps) should enable me to produce something that could have taken up to 17 hours in Excel (10 minutes × 100 ≈ 1,000 minutes).

OK, that might be pushing the marketing rather too literally, but the point is hopefully clear. For #MakeoverMonday, some people may concentrate on how far can they push Tableau outside of its comfort zone, others may focus on how they can integrate the latest best practice in visual design, whereas here I will concentrate on whether I can make anything intelligible in the time that it takes to wait for a coffee in Starbucks (on a bad day) – the “10 minute” viz.

So here’s my first “baked in just 10 minutes” viz on the latest MakeoverMonday topic – the growth of the population of Bermuda. Nothing fancy: the 10 minutes ran out just as I was changing the font for the chart titles, so it retains, for instance, the nasty default tooltips. But hey, it’s a readable chart that tells you something about the population change in Bermuda over time. Click through for the slightly interactive version.

[Image: my 10-minute “Bermuda population growth” viz]