I normally think of the impact of the current wave of automation in terms of the level of skill/knowledge it takes to have a job that can't be automated. This article here pointed me toward a different approach.
Instead of focusing on the skill level of jobs that can't be automated, think about the efficiency gain you get from a skilled/trained person using a certain tool (including computers and robots) versus an unskilled or untrained person using older technology. That efficiency ratio basically sets pay scales. So instead of looking at the capabilities of non-automated jobs, we look at the skills required to do jobs after they have been highly automated, and whether it even makes sense to have anyone do anything the old way.
In one of the comments on the article, someone argues that the same type of article could have been written about plows, sewing machines, or other historical advances. I think this comment is a little oversimplified. One of the responses brings this out by asking how much more efficient a person who has mastered the plow is compared to someone who hasn't mastered it and uses it poorly. Compare that to how much a programming guru can accomplish relative to a complete novice. That ratio is very significant, and we'll come back to it later.
I think we can dig deeper still into this analogy. (Pun intended.) How long does someone have to work with a plow to get within a certain fraction (say 25%) of a master? How much more efficient is the master with the plow compared to a person with just a spade? Now go to the modern equivalent. How long do you have to work with a computer to be able to be 25% as efficient as a master coder? How efficient would a non-coder be using older technology (not a computer) for various tasks?
I have to admit I don't have much experience with plows, but I'm guessing I could get to 25% of the best in under a year. Actually, my gut tells me a week, but it is quite possible that I'm missing something. As for the plow compared to the spade, it is probably at least a factor of 10 and probably closer to 100. But what about the modern equivalent? Most people who have gone through 4 years of academic study and a few years of professional practice probably still won't get to 25% of a coding guru. As for the efficiency ratio of a skilled computer user on a modern computer compared to any previous technology (like paper pushing), the programmer and computer have to be at least several thousand times faster for the majority of tasks modern companies deal with.
How does this ratio impact employability and wages? Because it makes for nice round numbers, let's assume that a skilled person makes $100/hour. An unskilled person is economically equivalent if their wage is scaled down by the same efficiency ratio. So if they can get to 25% of the efficiency of the skilled person in a reasonably short period of time, they are probably a good bargain for an employer as long as they make under $25/hour. (This ignores things like taxes, benefits, and the cost of infrastructure/equipment, to keep things simple.) However, if the unskilled person is more than 100x less efficient and it will take them a long time to improve significantly, they are basically unemployable. For hiring them to be a bargain, they would have to make pennies per hour, and they would be better off begging or resorting to petty crime.
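To make the arithmetic concrete, here is a minimal sketch using the hypothetical numbers from the paragraph above (a $100/hour skilled wage and efficiency fractions of 25% and 1%; the function name is my own, and the model deliberately ignores taxes, benefits, and equipment just as the text does):

```python
def break_even_wage(skilled_wage, efficiency_ratio):
    """Wage at which a less-skilled worker costs the same per unit of output
    as a skilled worker earning `skilled_wage`."""
    return skilled_wage * efficiency_ratio

# A worker at 25% of a $100/hour expert's efficiency breaks even at $25/hour.
print(break_even_wage(100, 0.25))  # 25.0

# At 1/100th of the expert's efficiency, break-even is $1/hour --
# effectively unemployable at any legal wage.
print(break_even_wage(100, 0.01))  # 1.0
```

The point of the toy model is that the employer is indifferent between the two workers only at the break-even wage; minimum-wage laws and fixed per-employee costs push the real cutoff even higher.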
The argument that previous advances in technology increased overall productivity and led to new job creation is perfectly accurate, and in theory it applies today as well. There is just one problem. I'm pretty sure the efficiency ratio grows exponentially, just like technological advancement. It certainly has for recent decades, if not centuries, and I fully expect that to hold for another decade or two. This exponential growth isn't slow, either. That means that once it starts to break away, it just soars off. Taken to the extreme, you can imagine a world where any particular job can be accomplished by a single, skilled person. You don't hire a second one because there isn't a need to. The first one can do it all. That extreme might not be realistic, but you don't have to go that far before many things start to break down.
One of the links in the above article goes to this.
Simply watch the movie and you see a company that has huge volume and very few employees. It doesn't take much imagination to see how you could get rid of most of the people doing unskilled work and replace them with different types of robots managed by a much smaller number of people. The way this scales, you have a company that can handle a huge fraction of the non-perishable goods purchases in the country with very few employees. Sure, they buy a bunch of robots, but those robots were built in dark factories. Only the designers and programmers are humans, not the builders. Now introduce self-driving delivery vehicles and things get even more interesting.
Of course, the normal economic model is that this all creates new jobs. Costs go down and people can buy more. How much training/skill do you need for the jobs it creates, though? How much stuff do we really need to buy? There is a point of diminishing returns, and we might well have passed it already. The normal model of growth has been fueled by growing populations and growing wealth. I see that breaking down. No exponential can go on forever, even at low rates. (To see this, simply calculate what happens with 1% growth per year for a few thousand years.) So we have a race of exponential growths here. Which one crashes out first, and to what end?
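The parenthetical calculation above is easy to carry out. A quick sketch of what even "slow" 1% annual growth compounds to over long horizons (function name mine):

```python
def growth_factor(rate, years):
    """Total multiplier after compounding `rate` annual growth for `years`."""
    return (1 + rate) ** years

print(f"{growth_factor(0.01, 100):.2f}x")   # ~2.70x after a century
print(f"{growth_factor(0.01, 1000):.1e}x")  # ~2.1e+04x after a millennium
print(f"{growth_factor(0.01, 3000):.1e}x")  # ~9.2e+12x after 3000 years
```

A trillion-fold increase in population or consumption is physically absurd, which is the point: any exponential, however gentle, has to stop somewhere.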