When co-pilot becomes pilot
May. 2nd, 2024 03:41 pm
I'm a bit concerned at how we're throwing around the term "AI" to describe the process of mimicking intelligence starting from random noise and training data.
And at the other end of the pipeline, I'm also concerned at how we use the word "intelligent" for systems that read and visually examine data and recognize various things in it. There is "classification", and there is "curation". These things do the former but not the latter. It still takes a human to do the second.
This is not "intelligence". Whatever intelligence is active in this system is the intelligence that a human is bringing to the table as they manipulate a prompt in a generative tool to get some result, or change the way it classifies subsequent information.
Most of our current concerns are about how a human can - and often does - bring almost no intelligence to this process, and then uses the result to try and fool others.
This is a huge problem in higher education currently. Reforms are needed. And I think we have to frame any reforms by asking, "what’s the goal?" If it’s to make students employable, that suggests one approach... If it’s to give them a liberal-arts-style understanding of themselves, society, and the world, maybe that suggests a different approach. I don't have answers here.
I keep asking myself how these tools can enhance my own work. And mostly what they can do, so far, is make suggestions to scaffold parts of a codebase that aren’t written yet, based on the existing codebase, and a huge library of examples. This is pretty handy, though you still need to rewrite large chunks of it and of course you have to understand all of it.
In less than a year I suspect this is going to be formalized into a process where a software developer has the option of explaining, in writing or out loud, what they want a function to take in, what they want to do with it, and what to return, and the computer will spit out a really high quality first draft of that function. Like, to the point where it might not even need any editing at all. That could be very useful to a person like me, who has enough experience to know what I want, and the ability to describe what I want in reasonably clear human language.
But then, let's take that a step further: Imagine that we now preserve the description - nay, the entire dialogue - that I, the developer, had with the generative tool to create that function. Now imagine that dialogue is preserved for all the functions in the codebase. Now imagine that that dialogue is what we feed into a "compiler" to generate code, in whatever language is appropriate for a platform. And computers run that, and we roll with it, until something goes sideways and only then do we break out debugging and forensics tools.
That’s awfully plausible, and it will change the nature of software development. My resume will no longer list "Python, C, C++, JavaScript, TypeScript". It will look more like an academic's resume: "I authored this, I was on the development committee for this, I co-authored the spec for that..."
Far from making people like me obsolete, it will make people like me more useful, because I can spend a lot more time looking at overall systems, consulting with people to nail down requirements, and drilling down into code with fancy tools to find bugs only when something goes wrong... All things that require long-term expertise.
But we’re not there yet, of course. Maybe another 2 or 3 years?
no subject
Date: 2024-05-02 11:43 pm (UTC)
Good point about evaluating the contribution. I believe management is a bit panicked these days. And many bullshitters are too.
But see, as to what use there is - I like Copilot, it slaps together good comments (on many occasions they are not so good, but are easily fixable). What I really appreciate is how it slaps together scripts. It definitely knows better.
no subject
Date: 2024-05-03 05:29 am (UTC)
No need to wait a year. I've already done that in several ways. It really depends on how complex a function you want - that's what may still be in the future.
One time I was literally prompting with a function specification: "Write a function that will take a hexadecimal representation of an IP address and port and return a pair of a string in decimal dotted notation and an integer." It wrote a pretty correct function, and through further interaction I was able to iron out the details, like endianness and handling bad input, together with unit tests including negative test cases.
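For concreteness, here's roughly what such a function might end up looking like in Python. This is a hedged sketch, not the commenter's actual code: the function name is made up, and I'm assuming the /proc/net/tcp convention (the IPv4 address as a little-endian hex word, the port as big-endian hex), since the comment doesn't pin down the exact input format:

```python
def parse_hex_endpoint(endpoint: str) -> tuple[str, int]:
    """Parse 'AABBCCDD:PPPP' into a dotted-decimal IP string and a port int.

    Assumes the /proc/net/tcp style: the address is a 32-bit word printed
    little-endian (so '0100007F' is 127.0.0.1), and the port is plain
    big-endian hex. Raises ValueError on malformed input.
    """
    try:
        addr_hex, port_hex = endpoint.strip().split(":")
        if len(addr_hex) != 8 or len(port_hex) != 4:
            raise ValueError
        # Reverse the bytes of the little-endian word to get network order.
        addr_bytes = bytes.fromhex(addr_hex)[::-1]
        ip = ".".join(str(b) for b in addr_bytes)
        port = int(port_hex, 16)
    except ValueError:
        raise ValueError(f"not a hex endpoint: {endpoint!r}")
    return ip, port
```

With that convention, `parse_hex_endpoint("0100007F:0050")` yields `("127.0.0.1", 80)`, and the negative test cases the commenter mentions would simply assert that garbage input raises `ValueError`.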
The other time was a really eerie experience, where the IDE suggested pieces of a query from variable names, including reuse of other variables that contained other pieces of the query. Like, ProjectCountWithErrors = ...and here the IDE plugged in a query that checks the right source, adds a filter to find those that have errors, and aggregates to count unique projects.
no subject
Date: 2024-05-03 06:06 am (UTC)
Did you submit your improved function back into the training data? :D
no subject
Date: 2024-05-03 06:21 am (UTC)