IT/Software career thread: Invert binary trees for dollars.

ToeMissile

Pronouns: zie/zhem/zer
<Gold Donor>
3,688
2,478
Fellow employees at my org barely use it except for simple questions. Technically we're limited to Microsoft Copilot, but organization-scoped. I don't use it, but I'm guessing this is a gimped model? Most complain that it's not great and lump all AI in with that assessment.
The default Copilot w/ M365 is solid for office/admin type stuff and even some basic coding. I’ve created a few little tools to help out adjacent teams, usually just a standalone HTML file with embedded JS. One pulls labels, notes, and lat/lon out of .kmz/.kml files, which the user reviews and manually uploads to our project management system. The other converts/bundles .heic images into a PDF. Nothing fancy, but it would take a couple of weeks to get something like that requested/built/integrated into an existing system; I did both in about a day, including review with the people who would be using them.
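Roughly, the core of the KML one looks something like the sketch below. It's a minimal version, not the exact tool, and it assumes the .kml has already been pulled out of the .kmz (a .kmz is just a zip, so that step would need something like JSZip) and that the file uses the usual un-prefixed KML tag names:

<input type="file" id="kmlFile" accept=".kml">
<pre id="out"></pre>
<script>
// Read the chosen .kml, pull each Placemark's label, notes, and lat/lon,
// and dump the result as JSON for the user to review before uploading.
document.getElementById('kmlFile').addEventListener('change', async (e) => {
  const text = await e.target.files[0].text();
  const doc = new DOMParser().parseFromString(text, 'application/xml');
  const rows = [...doc.getElementsByTagName('Placemark')].map(pm => {
    const get = tag => pm.getElementsByTagName(tag)[0]?.textContent.trim() ?? '';
    // KML stores coordinates as "lon,lat[,alt]", longitude first
    const [lon, lat] = get('coordinates').split(',').map(s => s.trim());
    return { label: get('name'), notes: get('description'), lat, lon };
  });
  document.getElementById('out').textContent = JSON.stringify(rows, null, 2);
});
</script>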
 

TJT

Mr. Poopybutthole
<Gold Donor>
46,547
127,177
Copilot is by far the worst of them. We have been using both Copilot and Cursor for a year or so, and across the entire engineering org Cursor is favored by over 60%. Myself included.

I hear Claude Code is "better", but its terminal-centric design is objectively worse than Cursor's IDE-based design, at least if you're trying to get Java/Python/C/etc. devs working with it. I'm sure it's quite dope for systems engineers and what have you.
 
  • 1Like
Reactions: 1 user

Khane

Got something right about marriage
21,553
15,450
I don't have much experience with the other agents, but Copilot for sure needs to be guided. It's still very useful, but sometimes it outputs some truly absurd nonsense.

Last week I was troubleshooting an issue with invoking an external assembly that makes a database call from within XSLT (don't ask). It really didn't know how to help with that, to the point that when I kept correcting it and letting it know its output didn't solve the issue, it eventually started blaming the comments in the XSLT, calling them malformed (they weren't). It was hilarious but also kind of tragic, because there are probably vibe coders who quite literally don't know something as simple as properly formatted comment tags in XSLT/XML. And they would probably trust the agent to take them down that rabbit hole.
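For what it's worth, you don't have to take the agent's word for claims like that. A quick check in the browser console (just a sketch, with xsltText standing in for the stylesheet loaded as a string) will surface a parser error if a comment really is malformed:

const doc = new DOMParser().parseFromString(xsltText, 'application/xml');
// An actually malformed comment (e.g. "--" inside it, or a missing "-->") makes the
// whole document ill-formed, and the parser flags it here instead of failing silently.
const err = doc.querySelector('parsererror');
console.log(err ? err.textContent : 'well-formed: the <!-- comments --> are fine');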
 
  • 1Like
Reactions: 1 user

ShakyJake

<Donor>
8,457
21,102
Copilot is by far the worst of them. We have been using both Copilot and Cursor for a year or so, and across the entire engineering org Cursor is favored by over 60%. Myself included.

I hear Claude Code is "better", but its terminal-centric design is objectively worse than Cursor's IDE-based design, at least if you're trying to get Java/Python/C/etc. devs working with it. I'm sure it's quite dope for systems engineers and what have you.
I've been using Claude Code and Codex CLI and I actually prefer them over the IDE integrations.
 
  • 1Like
Reactions: 1 user

Deathwing

<Bronze Donator>
17,652
8,661
How do you guys prevent exfiltration of proprietary code and sensitive information by AI? Assume, if you aren't already, that I'm extremely uninformed on this topic. I have a long list of negatives wrt AI, but the thought that one fuckup and, oops, our source code is now part of some LLM's training data makes me not want to even start experimenting.
 

Khane

Got something right about marriage
21,553
15,450
How do you guys prevent exfiltration of proprietary code and sensitive information by AI?

giphy.gif
 
  • 1Solidarity
Reactions: 1 user

Furry

Email Loading Please Wait
<Gold Donor>
27,458
40,530
How do you guys prevent exfiltration of proprietary code and sensitive information by AI? Assume, if you aren't already, that I'm extremely uninformed on this topic. I have a long list of negatives wrt AI, but the thought that one fuckup and, oops, our source code is now part of some LLM's training data makes me not want to even start experimenting.
Reminds me of how they tell us to never, ever use AI with CNSI, and then I go into a CNSI briefing and the computer is suggesting relevant search results via the default Copilot installation. I just assume the turbo geniuses in charge know what they’re doing.
 

ToeMissile

Pronouns: zie/zhem/zer
<Gold Donor>
3,688
2,478
How do you guys prevent exfiltration of proprietary code and sensitive information by AI? Assume, if you aren't already, that I'm extremely uninformed on this topic. I have a long list of negatives wrt AI, but the thought that one fuckup and, oops, our source code is now part of some LLM's training data makes me not want to even start experimenting.
With enterprise licensing they’re supposed to segregate your company’s data and prevent it from getting out 🤷‍♂️
 

Khane

Got something right about marriage
21,553
15,450
We live in a world where blatant corporate malfeasance and criminal behavior go *almost* entirely unpunished. You can crash the global economy or steal your customers' identities with fake accounts, and the worst that will happen is you pay a small fine.

They will say the data is private
They will say everything is protected

None of it will be true; there will be a *shocking* scandal somewhere down the line, and it will all play out the same way.

It's easier to beg for forgiveness than ask for permission and all that. Why wouldn't the same kind of unfettered greed exist in these AI companies when they know they won't really get punished?

In other words, why should I care? I hate the company I work for anyway.
 

Sheriff Cad

scientia potentia est
<Nazi Janitors>
31,198
74,112
Most programmers don't understand AI well enough to make this judgment (I include myself in that group). How can we expect lawyers to do so?
The lawyers are just taking what you guys (you guys being the tech experts) say about how it works technically and applying that to the data security provisions in contracts/statutes/regulations. They have absolutely no idea how it works under the hood.