AWS Lambda: a few years of advancement and we are back to stored procedures

Serverless computing is the new buzzword.

AWS describes Lambda (their implementation of Serverless) as the way you’ll do things after containers.

Go figure how behind you are if you are head down learning Docker thinking it’s “the next big thing”. Sorry.

In order not to look too legacy, I decided to push to GitHub a small experiment I built last year: a super short and simple Python program (which can be run as a Lambda function) that I used to record, in a DynamoDB table, the status of the vCloud Air login service.

My use case was fairly simple: I wanted a historical record of the up-time of the vCA login service (which at the time was experiencing some glitches I wanted to track) for trending analysis.

The code to make that happen was fairly trivial, but having a VM (running that code) that saved data into a database (running in the same or in a separate VM) seemed like the traditional bazooka to shoot a fly, considering the requirements.
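For the curious, here is a minimal sketch of what such a checker can look like. This is not my actual code: the URL and table name are placeholders, and the DynamoDB write is shown commented out so the sketch runs without AWS credentials.

```python
import time
import urllib.error
import urllib.request

# Placeholder endpoint; the real program pointed at the vCA login service.
LOGIN_URL = "https://vca.example.com/login"

def check_login(url, timeout=5):
    """Probe the endpoint and return a status record suitable for storage."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            code = resp.status
    except urllib.error.HTTPError as err:
        # The server answered, but with an error status; record it.
        code = err.code
    except (urllib.error.URLError, OSError):
        # DNS failure, connection refused, timeout, etc.: treat as down.
        code = 0
    return {"timestamp": int(time.time()), "status_code": code, "up": code == 200}

# In the real program the record was written to a DynamoDB table, roughly:
# import boto3
# boto3.resource("dynamodb").Table("vca-status").put_item(Item=check_login(LOGIN_URL))
```

Run on a schedule, a record like this is enough raw material for the trending analysis I was after.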

Enter Lambda.

I had attended the last two AWS re:Invent events, and Lambda (which intrigues me considerably) was always front and center.

I am not sentimentally as involved as Ant Stanley but I, for one, love the idea and the principles behind Lambda. BTW this is a picture of Ant’s latest tattoo:

What’s interesting about Lambda (to me at least) isn’t so much the fact that “you can focus on your code, upload just the bits, and let the back-end figure out how to run it for you”.

That is, to me, the traditional PaaS value proposition.

What intrigued me about Lambda is the fact that it’s “an extension of your data”. This changes the landscape quite dramatically.

In traditional PaaS world the code is the indisputable protagonist (oh, damn, and you also happen to need a persistent data service to store those transactions BTW).

With Lambda the data is the indisputable protagonist (oh and you also happen to attach code to it to build some logic around data).

A few years of advancement and we are back to stored procedures.
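To make the data-as-protagonist model concrete, here is a hypothetical sketch of code attached to data: a function triggered by a DynamoDB stream, reacting to item-level changes rather than being invoked directly. The event shape mirrors the documented DynamoDB stream record format; the handler name is my own invention.

```python
def on_table_change(event, context):
    """React to item-level changes flowing out of a DynamoDB table.

    Lambda invokes this with a batch of stream records; here we simply
    collect the new images of inserted items.
    """
    inserts = []
    for record in event.get("Records", []):
        if record.get("eventName") == "INSERT":
            inserts.append(record["dynamodb"]["NewImage"])
    return inserts
```

The point is the inversion: nothing calls this code on purpose; the data changing is what makes it run.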

In all seriousness, the Lambda example I authored for my own testing is the exact opposite of an event-driven approach (as described above). At best it’s a way to avoid the bazooka but, for what I wanted to achieve, a PaaS approach would (perhaps) have been more applicable.

There have also been a lot of discussions lately about the risk of being locked in by embracing Serverless architectures (like Lambda).

There is some truth to it. This, however, isn’t due (much) to how you write the code from a syntax perspective: while coding my Python program I noticed that, when the function runs in the context of Lambda, the platform expects to pass a couple of parameters to the function (“event” and “context”). I had to tweak my original code to accept those two inputs (even though I make no use of them in my program).
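The tweak amounts to wrapping the existing logic in a handler with the signature Lambda’s Python runtime expects. The body here is just a placeholder standing in for the original logic:

```python
def lambda_handler(event, context):
    # "event" carries the trigger payload and "context" carries runtime
    # metadata; Lambda passes both even if the function ignores them,
    # so the signature must accept them.
    # ... the original status-checking logic would go here, unchanged ...
    return {"ok": True}
```

A small change, but a change nonetheless.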

I am no Lambda expert but my take is that this isn’t a mere copy/paste of your own code into Lambda functions. Some tweaking may be required.

But this is still nothing compared to the level of complexity that will occur when you disintegrate your logic and attach it to data and/or events. At that point, re-assembling your code/logic (scattered all over the place) will be a gigantic effort.

So, IMO, the lock-in will not be a function of how different your code’s syntax will be versus running it on a platform you control (probably minimally different) but rather of how scattered and interleaved with other services your code will be (at scale).

Perhaps there will be a new meaning for “spaghetti code”?

I am not trying to say you should avoid using Lambda.

I also, for one, think that this whole lock-in thing is just rubbish.

There are so many people who waste so much energy trying to avoid lock-in that they end up doing nothing: as a matter of fact, standing still is a way to avoid lock-in.


P.S. Yes, I know that it’s called “Serverless” but it doesn’t mean “there are no servers involved”. Are we really discussing this?