ChatGPT AI

Tuco

I got Tuco'd!
<Gold Donor>
42,881
69,056
103d 8h 7m

Code:
import os
import csv

def search_files(directory, search_string, csv_file):
    with open(csv_file, 'w', newline='') as out:
        writer = csv.writer(out)
        writer.writerow(['File Path', 'String Value'])
        for root, dirs, files in os.walk(directory):
            for file in files:
                file_path = os.path.join(root, file)
                # use a distinct name so the CSV handle isn't shadowed
                with open(file_path, 'r') as infile:
                    file_contents = infile.read()
                    if search_string in file_contents:
                        writer.writerow([file_path, search_string])

search_files('/path/to/directory', 'search_string', 'output.csv')
Thanks. There ended up being a technical hurdle: the files had a mix of ascii and binary, and the string I wanted was in the ascii bits, so this result didn't help. I got access back and tried to engineer a request that would produce a working version, but couldn't. Everything I generated would've either thrown an exception or failed in another way. No big deal though.
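For what it's worth, one way to sidestep the mixed ascii/binary problem is to read the raw bytes and search for the bytes of the string directly, so binary content can never raise a decode error. A minimal sketch, assuming the target string is plain ascii (the function name is made up, not from the generated script):

```python
import csv
import os

def search_binary_safe(directory, search_string, csv_file):
    # Search raw bytes so binary file contents can't raise a decode error
    needle = search_string.encode('ascii')
    with open(csv_file, 'w', newline='') as out:
        writer = csv.writer(out)
        writer.writerow(['File Path', 'String Value'])
        for root, dirs, files in os.walk(directory):
            for name in files:
                path = os.path.join(root, name)
                with open(path, 'rb') as infile:
                    if needle in infile.read():
                        writer.writerow([path, search_string])
```

Reading whole files with `.read()` is fine for logs but worth streaming in chunks if the files are huge.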
 
  • 1Like
Reactions: Edaw

Tuco

I got Tuco'd!
<Gold Donor>
42,881
69,056
103d 8h 7m
Another cool usage of the tool is that if you need to evaluate alternative functions for whatever reason, it's easy to generate them. E.g., the above request with glob.

"write a python script that uses the glob function to recursively search a directory structure for files that have a given string and then dumps the value of that string and the filepath to a csv file."



Code:
import glob
import os
import csv

def search_files(directory, search_string):
    # Use glob to recursively search for files in the directory
    for file_path in glob.glob(os.path.join(directory, '**/*'), recursive=True):
        # Check if the file is a regular file and not a directory
        if os.path.isfile(file_path):
            with open(file_path, 'r') as file:
                # Read the contents of the file
                file_contents = file.read()
                # Check if the search string is in the file contents
                if search_string in file_contents:
                    yield file_path, search_string

# specify the directory to search
directory = 'path/to/directory'
# specify the search string
search_string = 'example'

# Create a new CSV file to store the results
with open('search_results.csv', 'w', newline='') as csvfile:
    writer = csv.writer(csvfile)
    # Write the header row
    writer.writerow(['File Path', 'Search String'])
    # Write the results to the CSV file
    for file_path, string in search_files(directory, search_string):
        writer.writerow([file_path, string])

This script uses the glob.glob function to recursively search for all files in the specified directory and its subdirectories. It then opens each file and checks if the specified search string is present in the file's contents. If the search string is found, the script writes the file's path and the search string to a CSV file named search_results.csv
 
  • 1Like
Reactions: Edaw

Mist

Mythic Spellslinger
27,267
17,313
276d 11h 43m
I was gonna say, if it was just ascii files like logs, you can just do that with bash commands...
 
  • 1Like
Reactions: Edaw

Tuco

I got Tuco'd!
<Gold Donor>
42,881
69,056
103d 8h 7m
I was gonna say, if it was just ascii files like logs, you can just do that with bash commands...
write a single line bash command that recursively searches a directory structure for files that have a given string and then dumps the value of that string and the filepath to a csv file.


grep -rl "string" directory | sed 's/$/,string/' > output.csv


From my experience there are alwayyyys dumbass complications or new features to tag on (ex: the binary nonsense, plus the data I wanted was actually in some json and the search was just the key, so I had to decode the json object or search for it or whatever). So I rarely try to engineer a bash command for this stuff.
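A rough Python sketch of that json variant, for the record: walk the tree, try to parse each file as JSON, and dump the value under the key when it's present. The function name and key handling here are illustrative, not from the thread:

```python
import csv
import json
import os

def dump_json_key(directory, key, csv_file):
    # Walk the tree, parse each file as JSON where possible, and record
    # the value stored under `key` (all names here are illustrative)
    with open(csv_file, 'w', newline='') as out:
        writer = csv.writer(out)
        writer.writerow(['File Path', 'Value'])
        for root, _, files in os.walk(directory):
            for name in files:
                path = os.path.join(root, name)
                try:
                    with open(path, 'rb') as infile:
                        data = json.loads(infile.read())
                except (ValueError, OSError):
                    continue  # not valid JSON (or unreadable); skip it
                if isinstance(data, dict) and key in data:
                    writer.writerow([path, data[key]])
```

The `except ValueError` also catches decode errors on binary files, since `UnicodeDecodeError` and `json.JSONDecodeError` are both subclasses of it.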
 
  • 1Like
Reactions: Edaw

Edaw

Parody
<Gold Donor>
7,092
45,397
322d 10h 26m
Yeah, I love this thing. I wonder if it will be integrated into Windows/Office, etc.
 

Deathwing

<Bronze Donator>
15,430
6,363
209d 19h 31m
From my experience there are alwayyyys dumbass complications or new features to tag on (ex: the binary nonsense, plus the data I wanted was actually in some json and the search was just the key, so I had to decode the json object or search for it or whatever). So I rarely try to engineer a bash command for this stuff.
Normally, it takes a few weeks or months for me to revisit my own code and go "wtf was I doing here?". For bash or shell, it's the next day.
 
  • 1Like
Reactions: Koushirou

Ossoi

Tranny Chaser
<Rickshaw Potatoes>
12,668
6,006
112d 22h 52m
I'm 90% sure someone on this forum was doing that for a while but I can't remember who it was.

Skynet itself could not write a better parody of "Stan" about you than what I wrote.
 

Ossoi

Tranny Chaser
<Rickshaw Potatoes>
12,668
6,006
112d 22h 52m


I can't even be bothered to read the rest, what a failure!
 

Sanrith Descartes

Veteran of a Thousand Thread-Ban Wars
<Gold Donor>
34,925
91,215
170d 14h 23m
Lawyers and Paralegals be like...



"Write a legal brief on the unenforceability of non-compete clauses under Pennsylvania law and cite at five cases in support of the brief."

The unenforceability of non-compete clauses under Pennsylvania law is a well-established principle in the state's jurisprudence. Non-compete clauses, also known as restrictive covenants, are agreements that limit an individual's ability to work for a competitor or start their own business after leaving a current employer. These clauses are often used by employers to protect their confidential information and customer relationships.

However, under Pennsylvania law, non-compete clauses are generally unenforceable unless they are reasonable in scope, duration, and geographic reach. This principle is supported by several cases, including:

  1. "Weiman v. Dowling" (1999) in which the court held that non-compete clauses must be reasonable in duration, as well as scope and geographic reach, in order to be enforceable.
  2. "Buckman v. Buckman" (1946) in which the court stated that "restrictive covenants in employment contracts are not favored in the law and will not be enforced unless reasonable under all the circumstances."
  3. "Greenspan Bros., Inc. v. Shulman" (1939) in which the court held that non-compete clauses must be "narrowly drawn so as not to impose any undue hardship on the employee."
  4. "Solomon v. Solomon" (1939) in which the court held that non-compete clauses must be "reasonable as to time, place, and scope" in order to be enforceable.
  5. "Kislak v. Kislak" (1935) in which the court held that non-compete clauses must be "reasonable, both as to time and territory, and as to the nature of the business."
In conclusion, under Pennsylvania law, non-compete clauses are generally unenforceable unless they are reasonable in scope, duration, and geographic reach. Employers seeking to protect their confidential information and customer relationships must draft their non-compete clauses narrowly and with the above-cited principles in mind.
 

Daidraco

Ahn'Qiraj Raider
6,719
4,943
80d 18h 0m
A lot of stuff you ask this "ai" is limited, or similar. You guys are talking about having one that you have to pay for, and if I can have one that just gives an honest response w/o SJW, political, etc. influence - then ya, it's a product I'll pay for.
 

Aldarion

Avatar of War Slayer
6,876
19,169
124d 4h 56m
Used this tonight for the first time in an actual work context. Turns out this is a very effective way to search for a long list of taxonomic terms that are readily available on wikipedia etc. Nothing that really looks like AI, but actually useful. Looking them up in specific databases has its limitations and takes thought about formats etc; this took no thought.
 
  • 2Like
Reactions: ShakyJake and wilkxus

Ossoi

Tranny Chaser
<Rickshaw Potatoes>
12,668
6,006
112d 22h 52m
Used this tonight for the first time in an actual work context. Turns out this is a very effective way to search for a long list of taxonomic terms that are readily available on wikipedia etc. Nothing that really looks like AI, but actually useful. Looking them up in specific databases has its limitations and takes thought about formats etc; this took no thought.

This guy asked for a group invite earlier. I was like where do I know that name from lol. I just realized

 

wormie

Critic of Critical Jewish Theory
<Gold Donor>
14,892
28,904
292d 11h 54m
Used this tonight for the first time in an actual work context. Turns out this is a very effective way to search for a long list of taxonomic terms that are readily available on wikipedia etc. Nothing that really looks like AI, but actually useful. Looking them up in specific databases has its limitations and takes thought about formats etc; this took no thought.
This is why Google is shitting its pants. This thing is going to replace web search.