Hacker News

> their code is so special that no one else could have ever come up with it independently

I'm worried about exactly the opposite: having Copilot help me write code that seems quite generic to me, but which in fact makes my code subject to a license I don't even know about, and/or simply violates copyright.

For an open-source project this could be embarrassing but probably fixable. It gets more complicated if FAANG is doing due diligence on your company. I can see Copilot being both an accelerant and, later, a liability for startups.



There's a setting on GitHub that blocks any suggestions that exactly match code in the training set. I doubt you'd ever get in trouble for code that was similar in structure but had different variable names from existing licensed code (especially since most small snippets of code are not terribly unique to begin with).


I mean, it's nice that they have a setting for the bare minimum a lazy undergrad would do to avoid getting caught for plagiarism — replace some of the words in the copied paragraph with replacements from a thesaurus. It's not something I'd personally expect to hold up under real scrutiny though.


AFAIK that's not enough; for instance, see the long-standing industry practice that people working on the Important Stuff are never allowed to look at the source code of the Direct Competitor, or clean-room reverse engineering, etc.

I guess time will tell how much acquiring companies (my worry) care about Copilot. Given the difficulty hiring good devs, and the productivity level of body-shop devs, I see it getting a whole lot of use very soon, acknowledged or not.


There's a big difference between reverse engineering (i.e. intentionally writing software that behaves identically to another piece of software) and writing your own code to solve your own problem, which may superficially contain small portions of similar logic to some other project. Copyrighted code has to be sufficiently creative and unique to qualify; otherwise, after the first person wrote code to parse json from a web request, no one else would be able to do the same thing.
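For illustration, here's roughly the kind of snippet being talked about — ask any two developers to pull a couple of fields out of a JSON request body and they'll land on something very close to this (the function and field names below are made up, not from any particular project):

```python
import json

def parse_user(raw: bytes) -> dict:
    """Decode a JSON request body and pull out the fields we care about.

    Hypothetical example -- the field names are purely illustrative.
    """
    payload = json.loads(raw.decode("utf-8"))
    return {"name": payload.get("name"), "email": payload.get("email")}

print(parse_user(b'{"name": "Ada", "email": "ada@example.com"}'))
# -> {'name': 'Ada', 'email': 'ada@example.com'}
```

There's essentially one idiomatic way to write this, which is the point: a snippet this generic has no plausible claim to copyright-level originality.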


Then Microsoft should write this as a legal statement on their part that they will take responsibility for. But I doubt they will ever do that.


Microsoft is not the author of the software that Copilot helps produce. The person sitting at the keyboard using Copilot is the author.


This is a bit like saying that if you hire a freelancer to do some work, then you're the author of that work. I'm not too sure I agree with that.



Kind of interesting.. I would like to point out that this seems to be specific to the US.

But also.. In that case, when I commission an artist to paint my portrait, surely I can't claim to be the artist.. But I'm no lawyer.

I'm not sure there is a contractual agreement in GitHub's Copilot that says: "Any code you write here is commissioned work". But honestly I didn't read the T&Cs.

So I think you MAY have debunked my analogy, but not the main reason for the analogy.


How is copilot not the author?


Copilot is software, it can't be the author, just like the OS isn't the author when you copy and paste. Authors have to be humans.


Copy and paste doesn't really write code; it just copies it from one place to another. Copilot, on the other hand, generates new, potentially novel code.


It does sound like the value of code creators is soon going to see significant downward pressure.


I'm sure that's what people said when they went from punch cards to assembly, and from assembly to C, and from C to Java.... and yet, here we are. Tools that let us write higher level code faster, just allow us to create more complicated software in a reasonable amount of time.


I think the argument is now it takes considerably less brain power to do, thereby increasing the labor pool and devaluing the output.


That's still 100% true of the examples I mentioned. There's always a higher level to consider. When we moved to C, we could stop worrying about what registers we were using. When we moved to python/Java, we could stop worrying about managing memory. When we moved to web frameworks, we stopped writing the guts of our servers. And if anything, programmers have become even better paid, despite so many more people in the industry.


I agree with you--however, programmers have not become better paid because society values programmers. They have become better paid because software is a relatively new artefact that has taken everyday life by storm, which has made software companies immensely profitable, which in turn meant more companies wanted to create software and attract the people who could help them do it.

As software takes a back seat (or at least a "normal" seat) in society, would we see a normalization of income? Could this be hastened by the development and introduction of tools such as copilot?

Potentially, unless there are new / better things that humans can claim they can provide compared to AI tools. This is the point where I think you and I agree, and I think it's your primary argument in any case (unless I'm mistaken).


AI can code low level stuff. This one function. This small piece of logic. What it can't do is conceive of how to take a bunch of different functions and put them together to produce an actual product. It can't tell you if you should use postgres or mongo. Programmers will always be needed; we'll just move up the stack, and we'll produce more value per hour of our work, justifying our high salaries.

Compare the visible output of someone writing in assembly vs someone writing on top of a modern web framework. Is assembly harder? Yeah. But the web framework is going to give you a usable product in a fraction of the time with way more features. And that's worth more money to the company you work for.

It's always going to be a knowledge worker's job. It's always going to reward experience and creativity and attention to detail. A lot of programming is looking at the world, seeing a gap in what exists, and figuring out what best fits that gap. An AI can't do that. Programming is making 1000 tiny decisions that can't possibly be specified completely by a product manager and need a human to weigh the tradeoffs.


> AI can code low level stuff. This one function. This small piece of logic. What it can't do is conceive of how to take a bunch of different functions and put them together to produce an actual product.

That's what everybody in the chess world said: "AI can decide low level stuff. This one move. This small attack on a rook. What it can't do is conceive of how to take a bunch of different tactics and put them together to produce a game of chess."

...Until Deep Blue beat Garry Kasparov.

> It can't tell you if you should use postgres or mongo.

Yeah, and then came: "It may be able to play chess, but it can't tell you how to play Go."

Look how that went.


The hard part about writing code isn't "how to write a for loop" and similar trivial things. Copilot makes this process faster, but the hard part is still organizing your code so that it doesn't become a steaming pile of cow dung a few iterations down the line. That, Copilot does not do for you.

So, unless you are a code monkey punching code into autogenerated scaffolding all day, your job is safe.



