Copilot causes popular projects to become more popular

I have been using GitHub Copilot for the past several months and I honestly think it's one of the best tools I have ever used. I suspect it is writing about 50% of the code I produce, and only about 10-20% of the time is it wrong or in need of changes.

Approximately a week ago I started using HTMX and noticed that I had to explicitly force Copilot to use htmx by telling it so in the comments, despite htmx being littered all over my project. After an inflection point I no longer had to tell it. A similar thing happened on the backend, where Copilot would often recommend requests over urllib. This raises several interesting questions:

  1. Will the popular libraries become more popular because Copilot, by default, suggests using them?

  2. How will computer science education move forward? Outside of fundamental topics, any simple high school or college level programming assignment can be done in 10 seconds using Copilot.

  3. My experience is skewed as a full stack developer who often jumps between 3 or 4 different languages. The biggest advantage for me is that Copilot helps me remember the different syntax of each language rather than the logic of the code. I suspect the advantage of Copilot is lessened for those who work in a single language, but I can't confirm that.

  4. What is the ideal price point for this? I would easily pay $50-100 a month to use it, but I suspect I am in the minority.

October 24, 2022

Games based on Real World Locations

As a person who travels often, I always like to do some research before I visit a new location. If I have the time, playing a game based on the location can give me a little bit of understanding of the area. I feel it immerses you and makes you appreciate the culture of the place more. It's also a great way to get your kids more interested. One of my favorite things is hearing my little one make a reference to something they saw in a video game and make the connection to real life. Here is a short list that I was able to come up with:

Africa - Far Cry 2

Alaska, United States - Red Dead Redemption 2

Blackwater, Missouri, United States - Red Dead Redemption 2 (Blackwater)

Chernobyl, Ukraine - Call of Duty 4: Modern Warfare, S.T.A.L.K.E.R.

Chicago, Illinois - Watch Dogs

Donetsk, Ukraine - Call of Duty Warzone, Watch Dogs

Egypt - Assassin's Creed Origins

Hong Kong, China - Sleeping Dogs

Himalaya, Asia - Far Cry 4

Hollywood, California - Overwatch (Hollywood map, modeled on Paramount Studios)

Italy - Assassin's Creed 2 and Assassin's Creed Brotherhood, Forza Horizon 2, Final Fantasy XV (Altissia), Dark Souls (Anor Londo)

Los Angeles, California - L.A. Noire, Grand Theft Auto: San Andreas

Louisiana, United States (Parts of it) - Red Dead Redemption 2

Las Vegas, Nevada - Fallout: New Vegas

Manchester, England - Resistance: Fall of Man

Miami, Florida - Grand Theft Auto: Vice City

Moscow, Russia - Metro 2033

New Orleans, United States - Red Dead Redemption 2

Northeast Colonial America - Fallout 4

New York, New York - Grand Theft Auto 3 & 4, The Division, Marvel’s Spider-Man, The Warriors

Paris, France - Assassin's Creed Unity

San Francisco, California, United States - Watch Dogs 2

Seattle, Washington, United States - Infamous Second Son, The Last of Us 2

Syria - Assassin's Creed

Thailand - Tomb Raider Underworld

Tokyo, Japan - Persona 5, Yakuza

Washington, DC, United States - The Division 2, Fallout 3

Yosemite Valley, California, United States - Red Dead Redemption 2

Here is a cool list on a map. Feel free to contribute.

October 4, 2022

“Dockerizing” Your Side Projects, aka Moving All Your Side Projects off of Heroku

Now that Heroku has discontinued its free plan, I had the pleasure of taking 9 years of side projects and trying to figure out where to host them. A couple of weekends later I ended up with a solution that I am really happy with.

Everything uses the same docker-compose setup (for the most part):

  1. Web server with Flask
  2. Database with Postgres
  3. Backups of database with prodrigestivill/postgres-backup-local
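The three-service stack above can be sketched as a single docker-compose file. This is a minimal sketch, not my exact config: the service names, ports, and credentials are placeholders (prodrigestivill/postgres-backup-local is the real backup image mentioned above, and its SCHEDULE/POSTGRES_* environment variables are the ones its README documents):

```yaml
version: "3.8"

services:
  web:
    build: .                      # the Flask app, built from its own Dockerfile
    ports:
      - "8000:8000"
    environment:
      DATABASE_URL: postgres://app:secret@db:5432/app
    depends_on:
      - db

  db:
    image: postgres:14
    environment:
      POSTGRES_USER: app
      POSTGRES_PASSWORD: secret
      POSTGRES_DB: app
    volumes:
      - pgdata:/var/lib/postgresql/data

  backup:
    image: prodrigestivill/postgres-backup-local
    environment:
      POSTGRES_HOST: db
      POSTGRES_USER: app
      POSTGRES_PASSWORD: secret
      POSTGRES_DB: app
      SCHEDULE: "@daily"          # dump the database once a day
    volumes:
      - ./backups:/backups
    depends_on:
      - db

volumes:
  pgdata:
```

Because each project follows the same shape, spinning one up on the new server is just `docker-compose up -d` in its directory.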

Using a logdna docker container, all the output goes to LogDNA (now Mezmo), which costs only a few dollars a month. Everything runs on a $10/month DigitalOcean server. Memory usage has increased in a significant but manageable way (excluding the one-time memory leak).

I use certbot (no cost) for https if needed.

I’m still paying ~$50-60/month on Heroku for production projects that I don’t feel comfortable taking off yet. I anticipate moving the smaller paid projects over the next year.

The only thing I wish I could find is some sort of monitoring tool to tell me when a docker container is no longer working.
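In the meantime, a cron-friendly shell script can approximate this. A minimal sketch, assuming the Docker CLI is installed; the container names are placeholders, and the echo alert could just as easily be a curl to a Zapier webhook like the subdomain script below uses:

```shell
#!/bin/bash
# Compare the containers that should be running against `docker ps`
# and alert on any that are missing.

# Print each expected name (space-separated in $1) that does not appear
# in the newline-separated list of running container names in $2.
missing_containers() {
    for name in $1; do
        printf '%s\n' "$2" | grep -qx "$name" || echo "$name"
    done
}

EXPECTED="web db backup"                                # placeholder names
RUNNING=$(docker ps --format '{{.Names}}' 2>/dev/null)  # empty if docker is down

for name in $(missing_containers "$EXPECTED" "$RUNNING"); do
    echo "ALERT: container $name is not running"        # swap for a webhook/text
done
```

Dropping this in crontab every few minutes gets you most of the way to a monitoring tool.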

September 20, 2022

Ethereum Merge Hot Takes

  1. If you are mining, now is the time to sell your hardware. Look at whattomine.com. Look at the volume done by Ethereum. Now scroll down and see how little volume every other coin does. All that hardware is going to collapse every other shitcoin.

  2. If you are building a new computer, wait until after the merge to buy parts.

  3. Is the merge priced in? Who knows. I would argue that the ether price will most likely be flat until the merge, with a small possibility of going higher. Taking a long position now and selling right before the merge seems like a safe play.

  4. Why not hold until after the merge? Despite extensive testing, the potential for a long-tail bug or massive catastrophe will keep me away. On the other hand, if the ETH price crashes, it will present a rare and good opportunity to accumulate more.

  5. Staking is expensive. You need 32 ETH to stake, plus a decent rig with 1-2 TB of storage, plus additional ETH for transaction fees, and you can't withdraw until withdrawals are enabled. Maybe using Rocket Pool is a better idea? I need to look into it more.

September 1, 2022

Stuck on ‘Waiting For Image’ on AnyDesk

As TeamViewer has incessantly complained that I am using their product for commercial usage, I have now been forced to switch to another remote desktop application. AnyDesk has been great, except for connecting to a computer that does not have a monitor. When you try to connect to one, you will be stuck on a screen that says ‘Waiting For Image’. The way around this is to click the keyboard button -> ‘Ctrl + Alt + Delete’. Once that shows up, click cancel. The screen will be frozen again. Now click the ‘Adapt Resolution’ button under the button that looks like a monitor. Everything should start working again.

To avoid this unnecessary clicking you can buy a dummy HDMI plug on Amazon.

July 19, 2022

Subdomain Enumeration For Profit

I was inspired by this post: https://www.reddit.com/r/wallstreetbets/comments/p50n5p/amzn_is_up_to_something_with_afrm_obsessive_dd/

The TLDR was that people on the internet noticed that Amazon was going to integrate Affirm, a pay-later company, into its platform. The interesting part is that they figured it out before it was publicly announced.

Since large companies have a large digital footprint, it seemed likely that something like this would happen again. One avenue that I had not seen other people investigate was subdomains. The thought process is that companies will put developer functionality or potential announcements on a subdomain.

Subdomain enumeration is the process of finding the subdomains of a website. Unfortunately, most subdomains are not publicly available (unless the DNS server exposes a full DNS zone). Some suggest brute force; others crawl links or search engines to find new subdomains. I went with Anubis (https://github.com/jonluca/Anubis), which basically combines a whole bunch of methods.

Using bash and Zapier, I built something that sends me a text message whenever a new subdomain is found.

I started off with the sites: robinhood.com, amazon.com, opensea.com, coinbase.com

Bash Script

#!/bin/bash
# Poll Anubis for new subdomains and send each new one to a Zapier webhook.

SITE="amazon.com"
file1="amazon.txt"

# "if not exists" so the script can be re-run without erroring
sqlite3 amazon.db "create table if not exists n (id INTEGER PRIMARY KEY, subdomain TEXT);"

while true; do
        echo "Scraping"
        anubis -tS "$SITE" -o "$file1" # S for silent

        # get rid of the last line (says how long it took to complete)
        sed -i '$ d' "$file1"

        # get rid of the top 24 lines (boilerplate from anubis)
        sed -i 1,24d "$file1"

        sort -o "$file1"{,} # sort in place

        echo "-----"
        cat "$file1"
        echo "-----"

        while IFS= read -r line; do
                printf '%s\n' "$line"
                select_output=$(sqlite3 amazon.db "select * from n where subdomain='$line'")
                if [ -z "$select_output" ]; then
                        # new subdomain: notify via Zapier, then record it
                        curl -s -X POST -H 'http_location: amazon' --data "$line" https://hooks.zapier.com/hooks/catch/467498/b11111
                        sqlite3 amazon.db "insert into n (subdomain) values ('$line');"
                else
                        echo "already exists"
                fi
        done < "$file1"
done

July 5, 2022