Tinkering with Transposit (beta) and AirTable
Transposit is an interesting new service that aims to help when you're trying to connect various apps (or, more specifically, their APIs) together. But it does it in a novel way: it lets you query various data sources as if they were SQL objects.
That is an ingenious idea: most of us already know SQL and it maps quite well to what we're trying to achieve with many of the APIs.
To illustrate: Transposit lets you treat various services as SQL schemas and query them; not only that, it lets you perform joins and other operations and mix them with JavaScript and other useful concepts. And yes, you can do joins across services; for instance, I learned about Transposit from Jess's great blog, where she queries and joins GitHub issue data with her AirTable 'bases' (that's the name AirTable gives to their separate databases).
So when I encountered a data enrichment task I wanted done, I jumped at the opportunity to try Transposit myself. Unlike Jess in her blog post linked above, I'm only connecting to a single service - AirTable. (I love AirTable. I just wish they had cheaper/non-commercial paid tiers - I'm starting to hit the limits of the free version, but since I'm just using it for personal stuff, I can't justify shelling out $120 a year.)
Enriching AirTable - backstory
I have two AirTable bases that arose separately and in different forms (Word docs, Excel sheets, etc.) and went through multiple iterations until they ended up in AirTable. One of them tracks the video games I'd bought. The other tracks the games I've actually played and completed.
Since, thanks to things like game bundles, I now have many games I don't even know about, I wanted to be able to navigate my list better; that's why I coded up an integration in C# that looks up the games in the table, finds their data in the excellent (and free) IGDB database, and stores it in AirTable. This way I can filter by things like category without actually having to fill them out manually.
As I'm automating more and more of my data flows, I realized I wanted to be able to match the games I'd finished against the list of games I own. And a good way to do that seemed to be matching both of those to IGDB.
Enter Transposit
Instead of writing more C# code, I decided to approach this differently: I would match the game names from my Played table with the items in my Owned table, and wherever they match, copy the IGDB ID over. That way I can tell when two rows are talking about the same game - and if I feel like it, I might later build a data flow on top that takes the rows from the Played table and tracks them as played in my IGDB profile.
Here is where Transposit comes in. It lets me use SQL queries to join both of those tables, in different bases, by name - and then copy data from one to the other.
Loading data
First things first: let's write separate queries for each table.
When you first add AirTable as a data source, Transposit asks you for your API key. That's the one that will be used when you run your queries manually while writing and testing code. (See below for notes on deployment.)
`airtable.get_records` is how I tell Transposit I'm trying to load data from the AirTable service. Other things work as you'd expect in SQL: I can enter conditions that specify the table and base to be used, as well as row filtering criteria. In this case, I only want the rows that actually have a game from IGDB matched to them - i.e. where the referenced IGDB ID is greater than zero.
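Roughly like this - a sketch with placeholders for my real base ID and table name; the exact names of the connector's base/table parameters depend on what the AirTable data source generates for you:

```sql
-- Sketch of the 'owned games' operation. The base ID and table
-- name are placeholders; the `table` parameter name is an
-- assumption about the generated connector.
SELECT * FROM airtable.get_records
  WHERE baseId='appXXXXXXXXXXXXXX'
    AND `table`='Owned games'
    AND `IGDB ID` > 0
```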
A few things to note here:
- That's how you escape field/column names. Since my IGDB ID column has a space in the name, I have to escape it with backticks.
- Transposit has syntax extensions to let you build or read JSON data. Seems handy, but I haven't needed them here.
- I could have extracted the `baseId` into a parameter that would let other people reuse my query - but since it's for my highly specific case and would be useless for anyone else anyway, I didn't bother.
Similarly for the 'operation' that fetches the list of games I'd played that weren't yet matched to a game in IGDB (i.e. their IGDB ID is 0).
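It's the same shape, just pointed at the other base (again, names and IDs are placeholders):

```sql
-- Sketch of the 'played games' operation: rows not yet matched
-- to an IGDB entry. Base ID and table name are placeholders.
SELECT * FROM airtable.get_records
  WHERE baseId='appYYYYYYYYYYYYYY'
    AND `table`='Played'
    AND `IGDB ID` = 0
```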
Now for the join.
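Assuming the two operations above are saved as `getOwnedGames` and `getPlayedGames` (hypothetical names), it looks something like this:

```sql
-- Sketch of matchAirtableFinishedGames. AirTable records come
-- back as {id, fields: {...}}, hence the fields.Name paths; the
-- sub-operation names are placeholders for my own.
SELECT p.id AS recordId,
       p.fields.Name AS name,
       o.fields.`IGDB ID` AS igdbId
  FROM this.getPlayedGames AS p
  JOIN this.getOwnedGames AS o
    ON p.fields.Name = o.fields.Name
```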
Just like that. The potential power of this concept is really impressive.
This is what it looks like in the Transposit UI:
That's the JSON form. However, I can also view the results in a table.
At this point I have a table of the records I want to update: it has the game's name, the ID of the row in the Played table, and the IGDB ID I need to store there.
I'm missing one more Transposit operation: the one that'll do the updating. Perhaps a bit counterintuitively, it is also a SELECT statement.
This one is parametrized.
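A sketch of it - I'm assuming the connector's update call is named `update_record` and takes the new field value as a simple condition; the actual operation the AirTable data source exposes may differ:

```sql
-- Sketch of updateFinishedGameIgdbId. The update call's name and
-- the way the new value is passed are assumptions on my part;
-- @recordid and @igdbid are the operation's two parameters.
SELECT * FROM airtable.update_record
  WHERE baseId='appYYYYYYYYYYYYYY'
    AND `table`='Played'
    AND id=@recordid
    AND `IGDB ID`=@igdbid
```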
- The `@recordid` parameter sets the record (row) I want to update.
- The `@igdbid` parameter is the ID itself, i.e. the value I want to store.
The parameters are easily created in the editor.
Tying it together
The last thing I need to do is make Transposit call an update (`updateFinishedGameIgdbId`) for each row of the `matchAirtableFinishedGames` results. I kept thinking I should be able to express this in SQL, but haven't found any way to do so. (See below.) So I had to make do with JavaScript.
The code is pretty straightforward.
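A minimal sketch of it - the row field names match whatever the join operation returns, and I'm relying on `api.run` being how a JavaScript operation invokes another one:

```javascript
// Sketch of the glue operation: run the join, then call the
// update operation once per matched row.
(params) => {
  const matches = api.run("this.matchAirtableFinishedGames");
  for (const row of matches) {
    api.log("Updating " + row.name); // shows up in the run log
    api.run("this.updateFinishedGameIgdbId", {
      recordid: row.recordId,
      igdbid: row.igdbId,
    });
  }
  // The returned JSON reports how many records were processed.
  return { processed: matches.length };
}
```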
- The aptly-named `api` object exposes Transposit's functions.
- When you're calling the operations you've defined, don't forget the `this.` prefix.
- The returned value is a JSON object that tells me how many records were processed - more on that later.
Et voilà - now when I press Run, everything works. The calls to `api.log` tell me what game is being processed and I can see the updates appear live in AirTable. (Another one of its cool features.)
Deployment
If I just wanted to process this data once and be done with it, I could've stopped there. But since I also like playing around with Huginn, I wanted to see if I could somehow tie the two together.
Fortunately, Transposit exposes webhook endpoints, so it's really easy. (I could also run a deployment and schedule it in Transposit to run periodically, but since lots of my processes are already controlled from Huginn, I prefer to let it do the scheduling and just call Transposit when necessary.)
When I create a new Operation, I can select the Webhook type and choose the code to be run when it is hit.
Then I notice this warning:
It seems Transposit won't just re-use the API key I've used for testing in a production deployment. Instead, when I choose to deploy the webhook publicly, I have to explicitly specify the API key to be used when the webhook is hit - which is nice.
This is what the entire thing looks like, with the code written and the webhook deployed:
Everything works just as expected - when the deployed webhook's URL is hit, the job runs and enriches the AirTable data. Then it returns the JSON telling me how many records were processed. This information will also be emitted into a Huginn event, letting me pipe it into further processing if I want to.
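For illustration, triggering it is just an HTTP request - the URL (endpoint name included) is a placeholder for the one Transposit shows after deployment, and Huginn's agents simply do the equivalent on a schedule:

```javascript
// Placeholder URL; Transposit generates the real endpoint when
// the webhook is deployed.
fetch("https://example-app.transposit.example/api/v1/execute/enrichPlayedGames", {
  method: "POST",
})
  .then((res) => res.json())
  .then((body) => console.log(body)); // the { processed: ... } JSON
```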
Minor gripes
All in all, I like the service. The entire concept will be incredibly handy, given how often modern apps are mostly just glue between various services. That being said, there is still a lot of work ahead of them; the documentation is sparse and some features are still missing.
A few of the most irksome:
- The error reporting doesn't always show what's wrong (sometimes it just says 'Error').
- I haven't been able to figure out how to write a `WHERE` condition that would test whether an AirTable column is empty. (I've tried and failed, and after wasting enough time on it I just decided to initialize the column with zeroes. Who knows, maybe it's even there - but not in any docs or samples I've seen.)
- There is no support for `count(*)` - it would've been handy when checking whether my queries work and return the expected number of results.
- Transposit already supports plenty of APIs, but unfortunately, most of those I want are still missing.
- Last, but definitely not least: the "Commit Code" and "Discard Code" buttons are right next to each other, and the Discard one asks for no confirmation before forever deleting all of your work. That suuuuucks.
Hopefully, all of those will be addressed eventually. In the meantime: the service is great. Check it out.