When co-pilot becomes pilot
May 2nd, 2024, 03:41 pm

I'm a bit concerned about how we're throwing around the term "AI" to describe the process of mimicking intelligence starting from random noise and training data.
And at the other end of the pipeline, I'm also concerned about how we use the word "intelligent" for systems that read and visually examine data and recognize various things in it. There is "classification", and there is "curation". These systems do the former but not the latter. Curation still takes a human.
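To be concrete about what I mean by mere classification, here's a toy sketch (the rules are invented for illustration; a real system learns them from data): a function that maps inputs to labels, and nothing more. Deciding which of these photos deserves a place in an exhibit - the curation part - appears nowhere in the code.

```python
# A classifier in the mechanical sense: it maps inputs to labels.
# Toy keyword rules stand in for a trained model.

def classify(caption):
    """Label a photo caption by simple keyword rules."""
    lowered = caption.lower()
    if "dog" in lowered or "cat" in lowered:
        return "animal"
    if "mountain" in lowered or "beach" in lowered:
        return "landscape"
    return "other"

labels = [classify(c) for c in
          ["A dog on the beach", "Misty mountain morning", "Office party"]]
# labels is now ["animal", "landscape", "other"] - and that's all the
# system can tell you. Which photo is worth keeping is not its business.
```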
This is not "intelligence". Whatever intelligence is active in this system is the intelligence that a human is bringing to the table as they manipulate a prompt in a generative tool to get some result, or change the way it classifies subsequent information.
Most of our current concerns are about how a human can - and often does - bring almost no intelligence to this process, and then uses the result to try and fool others.
This is a huge problem in higher education currently. Reforms are needed. And I think we have to frame any reforms by asking, "what’s the goal?" If it’s to make students employable, that suggests one approach... If it’s to give them a liberal-arts-style understanding of themselves, society, and the world, maybe that suggests a different approach. I don't have answers here.
I keep asking myself how these tools can enhance my own work. So far, mostly what they can do is suggest scaffolding for parts of a codebase that aren't written yet, based on the existing codebase and a huge library of examples. This is pretty handy, though you still need to rewrite large chunks of it, and of course you have to understand all of it.
In less than a year I suspect this is going to be formalized into a process where a software developer has the option of explaining, in writing or out loud, what they want a function to take in, what to do with it, and what to return, and the computer will spit out a really high-quality first draft of that function. Like, to the point where it might not even need any editing at all. That could be very useful to a person like me, who has enough experience to know what I want, and the ability to describe it in reasonably clear human language.
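A toy sketch of the exchange I mean, with a spec I made up for illustration and the sort of first draft a tool might plausibly produce from it:

```python
# The developer's spoken or written spec, hypothetically:
# "Take a list of (category, amount) transactions and a minimum amount;
#  keep only transactions at or above the minimum, and return a dict
#  mapping each category to its total."

from collections import defaultdict

def totals_by_category(transactions, minimum):
    """Sum amounts per category, ignoring transactions below `minimum`."""
    totals = defaultdict(float)
    for category, amount in transactions:
        if amount >= minimum:
            totals[category] += amount
    return dict(totals)
```

The point is that the spec above is already complete enough that an experienced developer could verify the draft at a glance - which is exactly the skill that doesn't go away.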
But then, let's take that a step further: Imagine that we now preserve the description - nay, the entire dialogue - that I, the developer, had with the generative tool to create that function. Now imagine that dialogue is preserved for all the functions in the codebase. Now imagine that that dialogue is what we feed into a "compiler" to generate code, in whatever language is appropriate for a platform. And computers run that, and we roll with it, until something goes sideways and only then do we break out debugging and forensics tools.
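Purely as a thought experiment - the format and every name here are hypothetical - the preserved dialogue might live alongside the codebase as the real source of truth, fingerprinted so that regenerated code can always be traced back to the exact conversation that produced it:

```python
# Hypothetical sketch: the dialogue is the source; generated code is
# a derived artifact, like object code is today.

import hashlib

dialogue_log = {
    "parse_config": [
        ("developer", "Read key=value pairs from a text file, one per "
                      "line, and return them as a dict."),
        ("assistant", "Should blank lines and lines without '=' be "
                      "skipped or raise an error?"),
        ("developer", "Skip them silently."),
    ],
}

def dialogue_fingerprint(name):
    """Hash a function's recorded dialogue, so regenerated code can be
    tied to the exact conversation that specified it."""
    text = "\n".join(f"{who}: {line}" for who, line in dialogue_log[name])
    return hashlib.sha256(text.encode()).hexdigest()[:12]
```

Under that scheme, "recompiling" means regenerating from the dialogue, and a changed fingerprint tells you the spec - not just the code - has drifted.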
That’s awfully plausible, and it will change the nature of software development. My resume will no longer list "Python, C, C++, JavaScript, TypeScript". It will look more like an academic's resume: "I authored this, I was on the development committee for this, I co-authored the spec for that..."
Far from making people like me obsolete, it will make people like me more useful, because I can spend a lot more time looking at overall systems, consulting with people to nail down requirements, and drilling down into code with fancy tools to find bugs only when something goes wrong... All things that require long-term expertise.
But we’re not there yet, of course. Maybe another 2 or 3 years?