For somebody who grew up in the 90s, it's easy to forget that computers are not the only things that can "compute". You can carry out equivalent processes with cellular automata or even DNA. More relevantly, you can also do it with smartphones, microwaves, and digital watches; the chips inside all of these are, in principle, Turing complete. And for a given task, it ultimately doesn't matter whether it was done on a "computer".
Working with Amazon has made me wonder: are computers the right way to think about computation? Come to think of it, even virtual computers come with a lot of baggage, like operating systems and task schedulers, that isn't relevant to whatever you're using the computer for. On some level what I really want is a computational *process*, and I really don't care whether it's being done with Windows or Linux, on one machine or several, or whether it's just some person with a very fast abacus.
I suspect that Amazon's "workflows" are the first step toward a post-computer world. The workflows do still run on virtual computers, but even those are fast becoming vestigial organs. Tools like Microsoft's CLI (the Common Language Infrastructure behind .NET) already let us write programs without thinking about the underlying hardware, and the next step is to dispense with computers entirely, so that we just think about "processes" that happen out in the cloud. Of course the computers will still be there - the computation does have to run on *something* - but they will be absent from the programmer's mind.
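To make that concrete, here is a toy sketch in Python. The `Workflow` class is purely hypothetical - it's not Amazon's API or anyone else's - but it shows what it looks like to define a process as nothing more than a sequence of steps, with no mention of machines:

```python
from typing import Any, Callable

class Workflow:
    """A process defined as an ordered list of steps, with no notion
    of where (or on what hardware) each step runs."""

    def __init__(self, *steps: Callable[[Any], Any]):
        self.steps = steps

    def run(self, value: Any) -> Any:
        # In a cloud service, each step might execute on a different
        # machine; to the programmer it is one logical process.
        for step in self.steps:
            value = step(value)
        return value

etl = Workflow(
    lambda rows: [r.strip() for r in rows],  # clean
    lambda rows: [r for r in rows if r],     # filter out empties
    len,                                     # aggregate
)
print(etl.run(["a ", "", " b"]))  # 2
```

The point of the sketch is what's *missing*: nothing in the definition of `etl` says anything about a computer.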
And of course, like most new developments, this one will really just be re-discovering something very old. Before the first transistor was ever made, mathematicians had already formalized what it meant to do "computation". Turing machines and lambda calculus both describe a completely abstract notion of computation, and maybe the future will bring us back full circle.
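You can see just how little "computer" there is in these formalisms in a few lines of Python. The Church numerals of lambda calculus encode arithmetic as nothing but function application - no hardware, no memory, not even numbers as data:

```python
# Church numerals: a number n is "apply f, n times".
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral by counting how many times f is applied."""
    return n(lambda k: k + 1)(0)

two   = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # 5
```

Addition here is just composing function applications; the fact that it happens to be running on a laptop is incidental to the definition.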