You can print the time, CPU, and memory usage per request and build metrics based on these outputs. This can be done in any major cloud, with open-source tooling, and even with simple Linux coreutils. One tool to visualize such metrics is Grafana.
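For the coreutils-ish end of that spectrum, a rough sketch with GNU time (its own package, but in the same spirit; the handler script and log path are hypothetical):

  # Log wall/CPU time and peak memory for each run of a request handler.
  # %e/%U/%S are seconds; %M is peak resident set size in KB (GNU time).
  /usr/bin/time -f '%e wall  %U user  %S sys  %M KB maxRSS' \
      ./handle_request.sh 2>> /var/log/request-metrics.log

From there it's one awk away from averages and percentiles, or you can ship the log to Grafana.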
I use Fiverr to make assets, and then either use Fiverr to make a design that I can easily build with one of the website-building tools (Wix, etc.), or I try to build it myself.
The easiest way is to find a landing page you think looks good and use it as inspiration with your own colors, assets, etc.
I sure do love it when a script runs in the background to poll a website I didn't give consent to, doing god knows what, all whilst I innocently press the return key...
My PS1 is based on a 20 line prompt I saw on mastodon. It does git, virtualenv, conda, guix, and some path shortening. 20 lines.
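Not that exact prompt, but a minimal bash sketch of the same idea (git branch, virtualenv, path shortening) looks something like:

  __ps1_git()  { git branch --show-current 2>/dev/null | sed 's/.\+/ (&)/'; }
  __ps1_venv() { [ -n "$VIRTUAL_ENV" ] && printf ' [%s]' "${VIRTUAL_ENV##*/}"; }
  PROMPT_DIRTRIM=3   # bash builtin: show only the last 3 components of \w
  PS1='\u@\h \w$(__ps1_git)$(__ps1_venv)\$ '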
After trying a few different CLI mail clients---mutt/neomutt, s-nail, etc.---I've come to love the approach of mblaze[0], i.e. just a collection of commands to interact with maildirs, which can be separately managed by OfflineIMAP or whatever.
I'm curious how mblaze+offlineimap compares to other similar setups: nmh[1], fdm[2], and getmail.
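For anyone who hasn't seen mblaze, a typical session is just a pipeline over a maildir (a sketch from memory; the man pages are the real reference):

  mlist ~/mail/INBOX | msort -d | mseq -S   # list maildir, sort by date, save as current sequence
  mscan                                     # one line per message in the sequence
  mshow 12                                  # display message 12 from the sequence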
My biggest frustration with .internal is that it requires a private certificate authority. Lots of organizations struggle to fully set up trust for the private CA on all internal systems. When you add BYOD or contractor systems, it's a mess.
Using a publicly valid domain offers a number of benefits, like being able to use a free public CA such as Let's Encrypt. Every machine will trust your internal certificates out of the box, so there is minimal toil.
Last year I built getlocalcert [1] as a free way to automate this approach. It allows you to register a subdomain, publish TXT records for ACME DNS certificate validation, and use your own internal DNS server for all private use.
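The overall shape of the split, with all names and addresses hypothetical: the public zone carries only a CNAME delegating the ACME DNS-01 challenge, while the real record lives purely in internal DNS.

  # Public zone: only the validation delegation is visible.
  #   _acme-challenge.app.corp.example.com.  CNAME  <your delegated validation record>
  # Internal DNS only: the actual private address.
  #   app.corp.example.com.                  A      10.0.0.15
  dig +short TXT _acme-challenge.app.corp.example.com   # public resolvers see the ACME token
  dig +short A   app.corp.example.com @10.0.0.2         # only the internal resolver answers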
— This command hits servers as fast as possible. Not sorry. I have encountered a very small number of sites-I-care-to-mirror that have any sort of mitigation for this. The only site I'm IP banned from right now is http://elm-chan.org/ and that's just because I haven't cared to power-cycle my ISP box or bother with VPN. If you want to be a better neighbor than me, look into wget's `--wait`/`--waitretry`/`--random-wait`.
— The only part of this I'm actively unhappy with is the fixed version number in my fake User-Agent string. I go in and increment it to whatever version's current every once in a while. I am tempted to try automating it with an additional call to `date` assuming a six-week major-version cadence.
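A toy version of that date trick might look like the following (the anchor version/date pair is illustrative, and real release cadences aren't this regular):

  # Guess a plausible major version: one release every ~6 weeks
  # from a known (version, date) anchor. Requires GNU date.
  anchor_version=120
  anchor_epoch=$(date -d '2023-12-06' +%s)
  weeks=$(( ( $(date +%s) - anchor_epoch ) / 604800 ))
  echo "Chrome/$(( anchor_version + weeks / 6 )).0.0.0"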
— The `--reject-regex` is a hack to work around lots of CMSes I've encountered where it's possible to build up links with an infinite number of path separators, e.g. a `www.example.com///whatever` containing a link to `www.example.com////whatever` containing a link to…
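For reference, a polite-mode version of the whole invocation might look roughly like this (UA string and regex illustrative, not my exact command):

  wget --mirror --page-requisites --adjust-extension --convert-links \
       --wait=1 --random-wait --waitretry=10 \
       --user-agent='Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/124.0.0.0 Safari/537.36' \
       --reject-regex='://.*//' \
       https://www.example.com/

The regex rejects any URL with a second `//` after the scheme, which stops the infinite-path-separator crawl.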
— I am using wget1 aka wget. There is a wget2 project, but last time I looked into it wget2 did not support something I needed. I don't remember what that something was lol
— I have avoided WARC because I usually prefer the ergonomics of having separate files and because WARC seems more focused on use cases where one does multiple archives over time (as is the case for the Wayback Machine or a search engine), whereas my archiving style is more one-and-done. I don't tend to back up sites that are actively changing/maintained.
— However I do like to wrap my mirrored files in a store-only Zip archive when there are a great number of mostly-identical pages, like for web forums. I back up to a ZFS dataset with ZSTD compression, and the space savings can be quite substantial for certain sites. A TAR compresses just as well, but a `zip -0` will have a central directory that makes it much easier to browse later.
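The wrap step itself is a one-liner (path illustrative):

  # -0 = store only (no DEFLATE), so the fs-level ZSTD compression still
  # sees compressible data instead of already-deflated bytes.
  zip -0 -r forums.example.com.zip forums.example.com/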
Here is an example of the file usage for http://preserve.mactech.com with separate files vs plain TAR vs DEFLATE Zip archive vs store-only Zip archive. These are all on the same ZSTD-compressed dataset and the DEFLATE example is here to show why one would want store-only when fs-level compression is enabled.
  [lammy@popola#WWW] zfs list spinthedisc/Backups/WWW
  NAME                      USED  AVAIL  REFER  MOUNTPOINT
  spinthedisc/Backups/WWW   772G   299G   772G  /spinthedisc/Backups/WWW
  [lammy@popola#WWW] zfs get compression spinthedisc/Backups/WWW
  NAME                     PROPERTY     VALUE  SOURCE
  spinthedisc/Backups/WWW  compression  zstd   local
  [lammy@popola#WWW] ls
  Academic      DIY          Medicine         SA
  Animals       Doujin       Military         Science
  Anime         Electronics  most_wanted.txt  Space
  Appliance     Fantasy      Movies           Sports
  Architecture  Food         Music            Survivalism
  Art           Games        Personal         Theology
  Books         History      Philosophy       too_big_for_old_hdds.txt
  Business      Hobby        Photography      Toys
  Cars          Humor        Politics         Transportation
  Cartoons      Kids         Publications     Travel
  Celebrity     LGBT         Radio            Webcomics
  Communities   Literature   Railroad
  Computers     Media        README.txt
Some of this could stand to be re-organized. Since I've gotten more into it, I've gotten better at anticipating an ideal directory depth/specificity at archive time instead of trying to come back to it later. For example, `DIY` (i.e. home improvement) should go into `Hobby`, which did not exist at the time; `SA` (SomethingAwful) should go into `Communities`, which also did not exist at the time; `Cars` into `Transportation`; etc.
`Personal` is the directory that's been hardest to sort, because personal sites are one of my fav things to back up but also one of the hardest to organize when they reflect diverse interests. For now I've settled on a hybrid approach. If a site is geared toward one particular interest or subculture, it gets sorted into `Personal/<Interest>`, like `Academics`, `Authors`, `Artists`, `Goth` (loads of '90s goths had web pages for some reason). Sites reflecting The Style At The Time might get sorted into `1990s` for a blinking-construction-GIF Tripod/Angelfire site or `2000s` for an early blog. Sometimes I sort personal sites by generation, like `GenX` or `Boomer` (said in a loving way — Boomers did nothing wrong), when they reflect interests more typical of one particular generation.
No metaphors trying to explain "complex" ideas, making them seem scary and overly complex. Instead, hands-on implementations with analogies as explainers, where you can actually understand the ideas and see how simple they are.
It's a steeper learning curve at first, but it's much more satisfying, and you actually earn the ability to reason about this stuff instead of writing over-the-top influencer BS.
For anyone disappointed that Handbrake doesn't allow you to specify a final file size and automagically figure out the rest, calculating this is straightforward.
average bitrate [kbps] = target size [kilobits] ÷ length [seconds]
Example: You have 2'48" file that you need to be 5GB or less.
- 2'48" is 10,080s
- 5GB = 40,000,000 kb
- average bitrate = 40,000,000 kb ÷ 10,080s = 3,968 kbps
- If audio is 256 kbps, average video bitrate should be 3,712 kbps or less
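If you'd rather script it, a quick shell sketch of the same arithmetic (decimal GB, fixed audio bitrate):

  target_kilobits=$(( 5 * 8 * 1000 * 1000 ))   # 5 GB -> 40,000,000 kb
  seconds=$(( 2*3600 + 48*60 ))                # 2h48m -> 10,080 s
  avg=$(( target_kilobits / seconds ))         # 3,968 kbps total budget
  echo "video bitrate: $(( avg - 256 )) kbps"  # minus 256 kbps audio -> 3,712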
If anyone from the Handbrake team is reading, thank you for all of your work on Handbrake. <3
I've been using GoTTY (https://github.com/yudai/gotty) to do the same thing, combined with ngrok or Cloudflare tunnels to get a publicly accessible URL. To enable multiplayer mode, you just need screen/tmux.
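Roughly like this (session name is arbitrary; GoTTY listens on 8080 by default):

  gotty -w tmux new -A -s shared   # -w permits writes, i.e. multiplayer
  # in another terminal:
  ngrok http 8080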
I have a pretty strong natural filter. Years of therapy and three different therapists have yet to get me to a place where I can truly open up, even when writing on a page I know no-one else will see.
Two days ago I did weed for the first time. I accidentally took a much larger hit than I meant to. For the first 15 minutes I had a complete out-of-body experience; nothing seemed real, I felt like I was floating through space, able to peer into reality at will, still not quite trusting that it wasn't all a dream. I'm pretty sure the term of art for what I experienced would be a psychotic episode.
After that... I lost my filter. Like, it was gone. And my mouth was a conveyor belt connected to the emotional part of my brain. No logic, just uninhibited speech, for 45 minutes, all while sobbing harder than I have in my entire life.
I exposed every last deep, dark secret I possessed. My fear of never being good enough. My fear that everyone in my life will leave me at some point. My fear that I've done so many hurtful things over my life that I'm unworthy of love and of the friends I have. And many more things I won't be sharing with the world, at least not yet.
Inside, it felt like my brain and my mouth were connected by a pipe, and all I could do was sit back and watch in horror as the very depths of my mind were laid bare for all to see.
A good friend of mine was with me while I did it. She heard everything. It's a mark of our friendship that she held me, reassured me that she loved me for who I am, not who I want people to think I am. Our friendship is even stronger now, something I would never have thought possible before having that experience.
It was terrifying, and yet oddly therapeutic. I'm seriously considering cannabis-assisted psychotherapy now.
---
I guess what I'm trying to say is: there are a number of substances that induce mind-altering states in ways that are relatively safe and free from long-term effects. If you're someone who can't seem to open up naturally, don't be afraid to try them. They just might change your life.
(As with everything, consume appropriately and safely. Have someone experienced with the substance you're consuming keep an eye on you. And for god's sake, don't do anything known to be addictive or to have severe negative side effects.)
Also, find friends who you can truly be yourself around, who love you even when they know the absolute worst about you. It makes all the difference to know that someone loves you not for who you want the world to think you are but for who you actually are.
Zellij instead of tmux (not necessarily better, but it's easier to use)
Xonsh instead of bash (because you already know Python, why learn a new horrible language?)
bat instead of cat (syntax highlighting and other nice things)
exa instead of ls (just nicer)
neovim instead of vim (just better)
helix instead of neovim (just tested it, seems promising though)
nix instead of your normal package manager (it works on Mac, and essentially every Linux dist. And it's got superpowers with devshells and home-manager to bring your configuration with you everywhere)
rmtrash instead of rm (because you haven't configured btrfs snapshots yet)
starship instead of your current prompt (is fast and displays a lot of useful information in a compact way, very customizable)
mcfly instead of your current ctrl+r (search history in a nice ncurses tui)
dogdns instead of dig (nicer colors, doesn't display useless information)
amp, kakoune (more alternative text editors)
ripgrep instead of grep (it's just better yo)
htop instead of top (displays stuff nicer)
gitui/lazygit instead of git cli (at least for staging, nice with file, hunk and line staging when you have ADHD)
gron + ripgrep instead of jq when searching through JSON in the shell (so much easier)
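The gron trick, for anyone who hasn't seen it (filename and pattern illustrative):

  # gron flattens JSON into greppable assignments, so each hit carries
  # its full path:
  gron api_response.json | rg 'timeout'
  #=> json.items[3].error = "timeout";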
I contracted an ex-Azure DNS team member to write articles about DNS [1] and published them for free. I considered my DNS knowledge okay, but I learned something from every article he wrote.
If you want to be better at DNS than >99% of your colleagues for the rest of your career, then invest a single day in reading those.
If you want to learn, I really recommend Operating Systems: Three Easy Pieces (OSTEP). I thought it was excellent and pretty easy to follow. https://pages.cs.wisc.edu/~remzi/OSTEP/
> But bat sonar, though clearly a form of perception, is not similar in its operation to any sense that we possess, and there is no reason to suppose that it is subjectively like anything we can experience or imagine.
Philosophy aside, bat sonar is different from the senses we possess in a really interesting way. Our eyes have excellent spatial resolution (up/down, left/right), some rough depth resolution (from stereo), and no innate sense of speed. Our brain processes the signal to fake even better spatial resolution, infer more about depth (small vs. far away), and infer more about speed (angle changes, among other things).
Bat sonar is completely different. Spatial resolution is poor. But they have first-class depth and speed information! They don't necessarily know where something is, but know exactly how far away it is, and how fast that distance is changing. One must suppose that their brains synthesize more spatial information from these senses, but that spatial information is still not going to feel reliable.
I'd love to be able to experience that for an hour. To live in this world where distance and speed are primary senses, and cross-range information is much fuzzier. What an incredibly different way to see the world.
Speaking of emulation/VMs, I have recently played with Multipass (from Canonical) to run Ubuntu VMs on my Apple M1. It works like a charm! In the past I have tried VMware and Vagrant, and while they more or less work, I was always running into issues. More recently I tried QEMU, and I spent many days trying to figure out the correct set of command-line parameters to run Linux.
The bad thing about Multipass is that it only runs Ubuntu VMs. The good thing (besides working out of the box) is that Multipass uses QEMU behind the scenes, and in the logs Multipass spits out, one can see the complete set of command-line parameters used to spin up the VM (so that can be used as a blueprint to run other VMs besides Ubuntu).
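For reference, the basic workflow is just a couple of commands (flags from memory; check `multipass launch --help`):

  multipass launch --name demo --cpus 2 --memory 4G --disk 20G
  multipass shell demo            # drop into a shell inside the VM
  multipass delete --purge demo   # clean up when done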
This is used to very cool effect in Peter Watts' novel "Blindsight", about an alien species whose brains are so much faster than ours that they notice and exploit this deficiency in ours.
And then of course there's Grafana, which is definitely not for the faint of heart.
There are so many page-of-bookmarks-style dashboards, but if I'm being honest, none of them are all that great. Of the above, I like Heimdall the best for its cleanliness and simplicity, but it's hardly customizable.
I think we'd be much better served by a really well-thought-out framework for self-hosted/homelab dashboards, with excellent API documentation and pluggable modules for things like authentication, data sources (e.g. Docker, DB, config file, service APIs), and the front end. This would let people easily build a dashboard with the features they want, and make it even easier for others to contribute a variety of "themes" for endless customization of how things are displayed.
The problem is that spam was/is so bad that extreme measures were taken to curb it. There are all kinds of invisible forces you run up against that can be difficult to figure out, such as IP blacklists and the like. And even if you set everything up properly and host your email with a responsible host, Microsoft will still mark your mail as spam.
I host my own email server with Vultr on an OpenBSD VM using OpenSMTPD and Dovecot, relaying all outbound mail through SMTP2Go (their free tier more than meets my needs). I have all of the necessary DNS entries set to mark my mail as legit, and I sign all outgoing mail using strong 2048-bit RSA keys. Thus far, I'm able to send mail and not have it marked as spam (at least to everyone that I've corresponded with thus far). It was a lot of work to get to that point, but not terrible.
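For the curious, "the necessary DNS entries" are the usual SPF/DKIM/DMARC trio, easy to spot-check with dig (domain, selector, and values illustrative):

  dig +short TXT example.com                 # SPF:   "v=spf1 include:spf.smtp2go.com ~all"
  dig +short TXT s1._domainkey.example.com   # DKIM:  "v=DKIM1; k=rsa; p=..."
  dig +short TXT _dmarc.example.com          # DMARC: "v=DMARC1; p=quarantine; ..."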
I'm a huge fan of the base16 color schemes - not for their appearance (though most look great), but for their ease of integration within the shell and vim. Just clone the repos below, drop a few lines in your shellrc/vimrc, then use a single bash command to change the scheme in both. No more mucking with Xresources.
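From memory, the shell half of the setup looks roughly like this (the base16-shell README has the canonical lines):

  git clone https://github.com/chriskempson/base16-shell ~/.config/base16-shell
  # in ~/.bashrc:
  BASE16_SHELL="$HOME/.config/base16-shell/"
  [ -n "$PS1" ] && [ -s "$BASE16_SHELL/profile_helper.sh" ] && \
      eval "$("$BASE16_SHELL/profile_helper.sh")"
  # then switching the scheme everywhere is a single command, e.g.:
  base16_ocean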