ChatGPT AI


Tuco

I got Tuco'd!
<Gold Donor>
45,320
73,376

Code:
import os
import csv

def search_files(directory, search_string, csv_file):
    # Open the output CSV once and stream results into it
    with open(csv_file, 'w', newline='') as out:
        writer = csv.writer(out)
        writer.writerow(['File Path', 'String Value'])
        # Walk the directory tree, checking every file for the string
        for root, dirs, files in os.walk(directory):
            for file in files:
                file_path = os.path.join(root, file)
                with open(file_path, 'r') as fh:
                    file_contents = fh.read()
                    if search_string in file_contents:
                        writer.writerow([file_path, search_string])

search_files('/path/to/directory', 'search_string', 'output.csv')
Thanks. There ended up being a technical hurdle where the files had a mixed encoding of ascii and binary, and the string I wanted was in the ascii bits, so this result didn't help. I got access back and tried to engineer a request that would produce a working version, but couldn't. Everything I generated would've either thrown an exception or failed in another way. No big deal though.
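
For what it's worth, a rough sketch of one way around that hurdle (untested against the actual files; the function name and paths are placeholders): read each file as raw bytes and decode leniently, so the binary portions get dropped instead of blowing up with an exception.

Code:
import os
import csv

def search_mixed_files(directory, search_string, csv_file):
    with open(csv_file, 'w', newline='') as out:
        writer = csv.writer(out)
        writer.writerow(['File Path', 'String Value'])
        for root, dirs, files in os.walk(directory):
            for name in files:
                file_path = os.path.join(root, name)
                # Read raw bytes so binary content can't raise a decode error,
                # then decode leniently; non-ascii bytes are simply dropped
                with open(file_path, 'rb') as fh:
                    contents = fh.read().decode('ascii', errors='ignore')
                if search_string in contents:
                    writer.writerow([file_path, search_string])

# hypothetical paths/strings for illustration
search_mixed_files('/path/to/directory', 'search_string', 'output.csv')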
 
  • 1Like
Reactions: 1 user

Tuco

I got Tuco'd!
<Gold Donor>
45,320
73,376
Another cool use of the tool: if you need to evaluate alternative functions for whatever reason, it's easy to generate them. For example, the same request as above, but with glob.

"write a python script that uses the glob function to recursively search a directory structure for files that have a given string and then dumps the value of that string and the filepath to a csv file."



Code:
import glob
import os
import csv

def search_files(directory, search_string):
    # Use glob to recursively search for files in the directory
    for file_path in glob.glob(os.path.join(directory, '**/*'), recursive=True):
        # Check if the file is a regular file and not a directory
        if os.path.isfile(file_path):
            with open(file_path, 'r') as file:
                # Read the contents of the file
                file_contents = file.read()
                # Check if the search string is in the file contents
                if search_string in file_contents:
                    yield file_path, search_string

# specify the directory to search
directory = 'path/to/directory'
# specify the search string
search_string = 'example'

# Create a new CSV file to store the results
with open('search_results.csv', 'w', newline='') as csvfile:
    writer = csv.writer(csvfile)
    # Write the header row
    writer.writerow(['File Path', 'Search String'])
    # Write the results to the CSV file
    for file_path, string in search_files(directory, search_string):
        writer.writerow([file_path, string])

This script uses the glob.glob function to recursively search for all files in the specified directory and its subdirectories. It then opens each file and checks if the specified search string is present in the file's contents. If the search string is found, the script writes the file's path and the search string to a CSV file named search_results.csv.
 
  • 1Like
Reactions: 1 user

Mist

Eeyore Enthusiast
<Gold Donor>
30,274
22,008
I was gonna say, if it was just ascii files like logs, you can just do that with bash commands...
 
  • 1Like
Reactions: 1 user

Tuco

I got Tuco'd!
<Gold Donor>
45,320
73,376
I was gonna say, if it was just ascii files like logs, you can just do that with bash commands...
write a single line bash command that recursively searches a directory structure for files that have a given string and then dumps the value of that string and the filepath to a csv file.


grep -r "string" directory | awk -F: '{print $1 ",string"}' > output.csv


From my experience there are alwayyyys dumbass complications or new features to tack on (ex: the binary nonsense, plus the data I wanted was actually in some json and the search string was just the key, so I had to decode the json object or search inside it or whatever). So I rarely try to engineer a bash command for this stuff.
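
As a rough sketch of that json wrinkle (the key and sample line below are invented for illustration): a plain string match only tells you the key is present somewhere, so the matching line still has to be parsed to pull out the actual value.

Code:
import json

def extract_value(line, key):
    # Parse the matched line as a json object and pull out the value
    # for the key; return None if the line isn't valid json
    try:
        obj = json.loads(line)
    except json.JSONDecodeError:
        return None
    return obj.get(key)

# Example with a made-up log line containing a json object
print(extract_value('{"request_id": "abc123", "status": "ok"}', 'request_id'))
# -> abc123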
 
  • 1Like
Reactions: 1 user

Deathwing

<Bronze Donator>
16,315
7,313
From my experience there are alwayyyys dumbass complications or new features to tack on (ex: the binary nonsense, plus the data I wanted was actually in some json and the search string was just the key, so I had to decode the json object or search inside it or whatever). So I rarely try to engineer a bash command for this stuff.
Normally, it takes a few weeks or months for me to revisit my own code and go "wtf was I doing here?". For bash or shell, it's the next day.
 
  • 1Like
Reactions: 1 user

Ossoi

Tranny Chaser
15,401
7,666
[attached screenshot]


I can't even be bothered to read the rest, what a failure!
 

Sanrith Descartes

Veteran of a thousand threadban wars
<Aristocrat╭ರ_•́>
41,351
107,244
Lawyers and Paralegals be like...

Star Trek GIF


"Write a legal brief on the unenforceability of non-compete clauses under Pennsylvania law and cite at five cases in support of the brief."

The unenforceability of non-compete clauses under Pennsylvania law is a well-established principle in the state's jurisprudence. Non-compete clauses, also known as restrictive covenants, are agreements that limit an individual's ability to work for a competitor or start their own business after leaving a current employer. These clauses are often used by employers to protect their confidential information and customer relationships.

However, under Pennsylvania law, non-compete clauses are generally unenforceable unless they are reasonable in scope, duration, and geographic reach. This principle is supported by several cases, including:

  1. "Weiman v. Dowling" (1999) in which the court held that non-compete clauses must be reasonable in duration, as well as scope and geographic reach, in order to be enforceable.
  2. "Buckman v. Buckman" (1946) in which the court stated that "restrictive covenants in employment contracts are not favored in the law and will not be enforced unless reasonable under all the circumstances."
  3. "Greenspan Bros., Inc. v. Shulman" (1939) in which the court held that non-compete clauses must be "narrowly drawn so as not to impose any undue hardship on the employee."
  4. "Solomon v. Solomon" (1939) in which the court held that non-compete clauses must be "reasonable as to time, place, and scope" in order to be enforceable.
  5. "Kislak v. Kislak" (1935) in which the court held that non-compete clauses must be "reasonable, both as to time and territory, and as to the nature of the business."

In conclusion, under Pennsylvania law, non-compete clauses are generally unenforceable unless they are reasonable in scope, duration, and geographic reach. Employers seeking to protect their confidential information and customer relationships must draft their non-compete clauses narrowly and with the above-cited principles in mind.
 
  • 2Like
  • 2Worf
Reactions: 3 users

Tuco

I got Tuco'd!
<Gold Donor>
45,320
73,376
I can definitely see a huge upside to setting up a semi-automated system to generate ChatGPT-ish responses on higher-profile forums. 2024 is going to be bots talking to bots even more than ever.
 
  • 3Like
Reactions: 2 users

Daidraco

Golden Baronet of the Realm
9,071
9,098
A lot of the stuff you ask this "AI" is limited, or comes out much the same. You guys are talking about having one that you have to pay for; if I can have one that just gives an honest response without SJW, political, etc. influence, then yeah, it's a product I'll pay for.
 

Mist

Eeyore Enthusiast
<Gold Donor>
30,274
22,008
Yeah I would only sub to ChatGPT if I could sub to a version with all the "safeties" turned off.
 
  • 2Like
  • 1Truth!
Reactions: 2 users

Aldarion

Egg Nazi
8,815
24,055
Used this tonight for the first time in an actual work context. Turns out this is a very effective way to search for a long list of taxonomic terms that are readily available on wikipedia etc. Nothing that really looks like AI, but actually useful. Looking them up in specific databases has its limitations and takes thought about formats etc; this took no thought.
 
  • 2Like
Reactions: 1 user

Ossoi

Tranny Chaser
15,401
7,666
Used this tonight for the first time in an actual work context. Turns out this is a very effective way to search for a long list of taxonomic terms that are readily available on wikipedia etc. Nothing that really looks like AI, but actually useful. Looking them up in specific databases has its limitations and takes thought about formats etc; this took no thought.

This guy asked for a group invite earlier. I was like, where do I know that name from lol. I just realized.

 

Asshat wormie

2023 Asshat Award Winner
<Gold Donor>
16,820
30,963
Used this tonight for the first time in an actual work context. Turns out this is a very effective way to search for a long list of taxonomic terms that are readily available on wikipedia etc. Nothing that really looks like AI, but actually useful. Looking them up in specific databases has its limitations and takes thought about formats etc; this took no thought.
This is why Google is shitting its pants. This thing is going to replace web search.
 
  • 6Like
  • 1Truth!
Reactions: 6 users

Ukerric

Bearded Ape
<Silver Donator>
7,862
9,413
This is why Google is shitting its pants. This thing is going to replace web search.
And why Microsoft is already investing bajillions in it. Bing with AI = 5% market share for Google.
 
  • 4Like
Reactions: 3 users