I started learning prolog just a few months ago, when I stumbled upon https://linusakesson.net/dialog/ which is a spin on prolog optimized for writing interactive fiction.
In terms of Prolog implementations, I played a bit with https://www.scryer.pl, but it still feels rough around the edges.
SWI-Prolog is the most popular and most batteries-included Prolog: https://www.swi-prolog.org
With its libraries and documentation it is a very practical language. What surprised me is that you can easily produce amazingly small stand-alone binaries.
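For example (from memory, so double-check the option names against the SWI-Prolog manual on qsave_program/2), compiling a file with a main/0 entry point into a self-contained executable looks roughly like this, where main.pl and myapp are placeholder names:

  swipl --goal=main --stand_alone=true -o myapp -c main.pl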
I really want to figure out how to have ants form bridges/towers to navigate gaps. It's on the roadmap for long-term goals, but there's a lot more to go in the short-term :) Above ground view where the ants leave the nest and collect food using pheromone trails is the next major feature coming up!
If you want to talk shop about ants, or help me tinker with code, discord link is in my profile :)
this page seriously undersells the versatility and utility of the units program
how long will my laptop take to charge at its current rate of charging?
You have: (22.8 Wh - 16.8 Wh)/7.4W
You want: time
48 min + 38.918919 sec
how long will a 2000mAh 18650 cell take to discharge at 2.5 watts, using a nominal voltage of 3.7 volts?
You have: 3.7 V 2 amp hour / 2.5 watt
You want: time
2 hr + 57 min + 36 sec
what energy density is that, so i can compare it to the volume needed for other forms of energy storage?
You have: 3.7 V 2 amp hour / circlearea(half 18 mm) 65 mm
You want: MJ/ℓ
* 1.6105936
/ 0.62088909
what's the specific energy of stoichiometrically mixed oxyhydrogen fuel?
You have: 44000 J/mol / ((2 hydrogen + oxygen)g/mol)
You want: MJ/kg
* 2.4423711
/ 0.40943818
okay but how much volume? say at atmospheric pressure?
You have: 3 mol gasconstant tempC(20) / 1 atm
You want: l
* 72.165351
/ 0.013857066
so that's how much energy density?
You have: 44kJ/_
You want: J/l
* 609.71089
/ 0.0016401216
(i may be off by a factor of 2 here)
how much energy can this capacitor hold?
You have: half (10V)**2 47 uF
You want: mJ
* 2.35
/ 0.42553191
how much energy density is that?
You have: half (10V)**2 47 μF / 15mm circlearea(3mm)
You want: J/ℓ
* 5.5409499
/ 0.18047447
how thick of a cable do i need to support me in a lightweight fabric-sling chair (or, from a different point of view, to pose a risk of accidental strangulation)? suppose its tensile strength is 2.7 gigapascals
You have: 120kg gravity / 2.7 GPa
You want: mm2
* 0.43585111
/ 2.2943615
You have: _
You want: circlearea
0.00037247244 m
You have: _
You want: mm
* 0.37247244
/ 2.6847624
note that this is the radius of the cable, not its diameter!
the datasheet says this 400×240 display uses 175 μW if all the pixels flip once per second and 50 μW for a static display. how much energy is that per pixel flip?
You have: (175 uW - 50 uW) / 400 240 1 Hz
You want: nJ
* 1.3020833
/ 0.768
if i overclock it to 60 fps how much power will it use?
You have: 60 Hz 400 240 1.3nJ
You want: μW
* 7488
/ 0.00013354701
and how many pixels is its diagonal?
You have: 400**2+240**2
You want:
Definition: 217600
You have: _**.5
You want:
Definition: 466.47615
what is the visual angle subtended by the sun as seen from earth?
You have: 2 sunradius/sundist
You want: milliradians
* 9.3049358
/ 0.10746984
You have: _
You want: dms
31 arcmin + 59.280781 arcsec
okay, how does that compare to the moon?
You have: moonradius 2 / moondist
You want:
Definition: 0.0090426639
on average the moon looks a little smaller, which is why annular eclipses are so common, but we can also calculate that total eclipses are possible because sometimes the moon looks bigger
You have: moonradius 2 / moondist_min
You want:
Definition: 0.0097530864
what percentage of this copper sulfate is actual copper?
You have: copper / (copper + (sulfur + 4 oxygen))
You want: %
* 39.813395
/ 0.025117175
how fast can i write to this slc flash chip without wearing it out in 53 years, assuming perfect wear leveling and no write amplification?
You have: 100 thousand 128 MiB/53 years
You want: bytes/second
* 8024.8943
/ 0.00012461223
how much fuel will this truck need to get across the country?
You have: 4000 km / (6.5 miles/gallon)
You want: l
* 1447.4744
/ 0.00069085852
how much is that per kilogram of lettuce or sodium lauryl sulfate or whatever?
You have: _/28 tonnes
You want: ml/kg
* 51.695513
/ 0.019344039
okay, but how much energy is 52 mℓ of diesel per kg of lettuce?
You have: _ 38.6 kJ/l
You want: kJ/kg
* 1.9954468
/ 0.5011409
how much data can i transfer overnight during unmetered hours on a 2400-baud modem?
You have: 8 hours 2400 bps
You want: MB
* 8.64
/ 0.11574074
how much power does the earth receive from the sun, assuming a solar constant of 1400 W/m²?
You have: 1400 W/m**2 * circlearea(earthradius)
You want: petawatts
* 178.52313
/ 0.0056015152
what would the equilibrium temperature of an object be if it were illuminated at that brightness and had a flat emission spectrum?
You have: (1400 W/m**2 / stefanboltzmann)**(1/4)
You want: tempC
123.24583
how about here in buenos aires at the winter solstice? first, what angle is the sun at anyway? we're at 34°36’ south, and the sun's latitude at the solstice is 23°26’
You have: 34° + 36' + 23° + 26'
You want: dms
58 deg + 2 arcmin
so that reduces the peak insolation to how much? here underneath the atmosphere we only get 1kW/m²
You have: cos(_) 1000 W/m^2
You want: W/m^2
* 529.4258
/ 0.0018888388
and that would be what temperature in equilibrium?
You have: (_/stefanboltzmann)**(1/4)
You want: tempC
37.698189
(integrating the sun's angle over the course of the day as the earth rotates is sadly beyond its capacities)
how much money could a sensible heat storage reservoir of 15 kg of water save me over 16 years? say power rates go down to only 2.5¢/kWh because of solar (15 kg of water cycled through a 100 K temperature swing once a day stores about 1500 kcal/day)
You have: 1500 kcal/day * 16 years * 2.5 cents/kWh
You want:
Definition: 254.69556 US$
what's the surface area of a 300mm × 400mm × 150mm backpack? like how much cloth?
You have: 2 (300mm 400mm + 400mm 150mm + 150mm 300mm)
You want:
Definition: 0.45 m^2
okay but in cm²
You have: _
You want: cm2
* 4500
/ 0.00022222222
what's the electrical impedance of a 1000 μF cap at an audio highpass frequency of 20Hz?
You have: 1/(2 pi 20 Hz 1000 uF)
You want: ohms
* 7.9577472
/ 0.12566371
what's the time constant of 100 pF (roughly the smallest capacitance you can get in a macroscopic circuit with any degree of precision) and 1 MΩ?
You have: 100 pF 1 megohm
You want: ms
* 0.1
/ 10
okay. so how long will an 0.1μF cap take to discharge through a 100kΩ resistor from 5 volts down to a 1.3 volt threshold?
You have: ln(5V/1.3V) .1 uF 100kilohm
You want: ms
* 13.470736
/ 0.074234991
how many bits of precision does a linear adc need to be able to measure a difference of 1.8 millivolts if 1.5 volts is full-scale?
You have: log(1.8mV/1.5V)/log(2)
You want:
Definition: -9.7027499
if this oxygen absorber contains 7 grams of iron which oxidizes to Fe₂O₃, how much air can it remove all the oxygen from? air is 21% oxygen by volume (and roughly by mass) and weighs 1.2 grams per liter
You have: 3 oxygen / 2 iron * 7 g
You want: g
* 3.0082138
/ 0.33242318
You have: _/21%/(1.2g/ℓ)
You want: ℓ
* 11.937356
/ 0.083770641
i've lost 7 kg over the last two months; how much of a caloric deficit does that represent in my diet?
You have: 7 kg 3500kcal/pound / 2 months
You want: kcal/day
* 887.30034
/ 0.0011270141
if you were to spread the moon evenly over russia, how deep would it be?
You have: spherevol(moonradius) / area_russia
You want:
Definition: 1286134.4 m
how big is nigeria compared to massachusetts?
You have: area_nigeria/area_massachusetts
You want:
Definition: 33.793093
how many ounces of platinum is a ton of oil worth at 40 dollars per megawatt hour?
You have: tonoil 40 dollars/MWh
You want: platinumounce
* 0.58368883
/ 1.7132416
or in grams?
You have: tonoil 40 dollars/MWh / platinumprice
You want: g
* 18.154752
/ 0.055081997
You can use qemu/libvirt/kvm on any Linux host to run macOS pretty easily these days[1]. I run Ventura on unraid with Nvidia GPU passthrough (w/ a Ryzen CPU even!) and it's been fairly painless.
You can also run macOS in docker, but it’s ultimately running through qemu/kvm as well[2]
I'm surprised the article didn't explain how to run diagnostics on an M1 Mac:
Shut it down, hold the power button until you get a screen with some buttons, hold down Command-D until it brings up a diagnostics screen, and answer a couple questions (your language, run local, etc.). Then it takes a minute and posts results. And lastly you click Reboot.
A good cartoon: a form factor is the function that describes how an object deforms when exposed to an outside influence with a particular momentum, so the form factor is a function of momentum.
There are many different kinds of outside influence. They can be scalar (think: just increasing the pressure uniformly), vector (put in an electric field), tensor (zap with a gravitational wave), pseudovector (magnetic field), pseudoscalar (zap with a pion).
Of course, you can apply a scalar outside influence and a vector at once. But the scalar, vector, tensor, pseudovector, and pseudoscalar labels denote different representations of the Lorentz group [lorentz].
What's more: the Wigner-Eckart theorem [wigner] basically says [cheat] that the response can be factored into three pieces: the strength of the external influence, a factor that depends only on the representation of the external influence, and a factor that depends only on the property of the thing you're talking about (a proton, in this instance).
So people call it the gravitational form factor because if you exposed the proton to a gravitational wave, it's the thing you need to know about the proton to know how it deforms.
Note that because of the factorization you don't actually have to zap the proton with a gravitational wave! You can measure it by zapping the proton with other stuff, as long as you can get that stuff to have the right rotational properties or measure the response to many different perturbations and sum the responses the right way to mock up a tensor operator. The experiment at JLab doesn't use gravitational waves, it uses these latter approaches.
Roughly speaking at zero momentum the form factor is the charge of the object you measure if it's just sitting there. So the electric form factor evaluated at zero momentum is the electric charge, the gravitational form factor evaluated at zero momentum is the mass.
What are radii? Express the form factor as a function of momentum^2 [possible]. In units that physicists like to work in (where c=1, hbar=1), the units of momentum are 1/length. Expand the form factor as a Taylor series in momentum^2 and you will get
form factor(p) = charge + # radius^2 p^2 + ...
where # is a known dimensionless number.
The above story is a cartoon but can be made more-or-less precise depending on how much quantum field theory you learn.
cheat: this is a little bit of a cheat, it's only true to leading order in a taylor series in the strength of the external influence.
possible: it's always possible to arrange this, or at least to separate the momentum dependence into a factor dictated by the rotational symmetry properties and another factor dictated by the object, just like in the Wigner-Eckart theorem.
This post has driven the most traffic to my site ever. Forgive me for not understanding the social rules here, but I made an account today to join the incredible discussion here. If you have any questions I will do my best to answer them. However, you've all been answering each other's questions as well as I ever could've already.
True Color support is a big enough reason to switch from Terminal.app to iTerm2.app.
But beyond the cool Quake-style "hot key" overlay terminal that's always at the ready, once you start using iTerm's other features you'll really appreciate everything it can do!
For example:
- being able to press Cmd-Shift-E to quickly show/hide timestamp overlays for your entire scrollback
- pressing Opt-Cmd-B to open the time-slider of the Instant Replay feature to go back and grab output from the terminal that's not in your scrollback because it was clobbered by `less` messing up your screen
- pressing Cmd-/ to highlight the active cursor
- pressing Opt-Cmd-M to set an annotation at the cursor so you can quickly jump through your scrollback to them with Opt-Shift-Cmd-↑ and Opt-Shift-Cmd-↓
- want to know when a long-running command finishes? Just press Opt-Cmd-A in that window or pane to set an alert and you'll get a macOS notification when the prompt appears (be sure to enable Shell Integration: https://iterm2.com/documentation-shell-integration.html)
the list goes on and on... like `vi`, iTerm rewards study of its capabilities.
Zellij instead of tmux (not necessarily better, but it's easier to use)
Xonsh instead of bash (because you already know Python, why learn a new horrible language?)
bat instead of cat (syntax highlights and other nice things)
exa instead of ls (just nicer)
neovim instead of vim (just better)
helix instead of neovim (just tested it, seems promising though)
nix instead of your normal package manager (it works on Mac, and essentially every Linux dist. And it's got superpowers with devshells and home-manager to bring your configuration with you everywhere)
rmtrash instead of rm (because you haven't configured btrfs snapshots yet)
starship instead of your current prompt (is fast and displays a lot of useful information in a compact way, very customizable)
mcfly instead of your current ctrl+r (search history in a nice ncurses tui)
dogdns instead of dig (nicer colors, doesn't display useless information)
amp, kakoune (more alternative text editors)
ripgrep instead of grep (it's just better yo)
htop instead of top (displays stuff nicer)
gitui/lazygit instead of git cli (at least for staging, nice with file, hunk and line staging when you have ADHD)
gron + ripgrep instead of jq when searching through JSON in the shell (so much easier)
The cause in this case was AWDL (Apple Wireless Direct Link.) Holding the Option key while clicking the Wi-Fi icon and clicking "Enable Wi-Fi Logging" and then checking /var/log/wifi.log will show AWDL scans starting and ending randomly, and when the scan is active it causes latency spikes every 1s like clockwork. Unrelated to AWDL, but if a process is requesting a Wi-Fi network scan (different from an AWDL scan), /var/log/wifi.log will also tell you the name of the process, such as "locationd" when the Location Service needs your location. (Tangential, but the locationd process rarely causes these latency spikes for me - on a default macOS install it very rarely requests scans in my experience, backed by my analysis of the log.)
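For example, something like this while reproducing the spikes (the exact wording in the log varies between macOS versions):

  grep -i awdl /var/log/wifi.log
  tail -f /var/log/wifi.log    # watch scans start and stop live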
AWDL has to be used for things like AirDrop, so it's expected to have this latency increase while you have the AirDrop window open scanning for nearby devices / sending files to other devices. There are other uses of AWDL (AirPlay, Auto Unlock, Universal Clipboard at the very least)[0], but I don't know what was triggering it so actively in my case... and why it wasn't happening on my M1 Air. It also wasn't always happening in the background like this, it just started that day.
The "fix" was to disable the awdl0 interface, but that may also cause AirDrop/AirPlay and related services to not function (I did not test.) It's easy to re-enable it though.
To disable:
ifconfig awdl0 down
To enable:
ifconfig awdl0 up
Upon disabling, the latency spikes go away permanently.
Could you ELI5 what the "problem of motion" is supposed to be? I've read the abstract, the first one and a half pages and the conclusion of the paper you linked, and I'm still confused what problem exactly you are seeking a solution to (or in fact demanding from GP).
I'm saying "you", not "author", because the paper's author seems to be interested in a very specific, somewhat niche question, which is studying the equation of motion of test particles (at rest) in alternative theories of gravity and in the situation where, in addition, the test particle is charged and interacting with a fixed gauge field. (One needs to be very careful with the term "gauge" here because the author confusingly uses it for both, the matter gauge theory / gauge group and the "gravitational gauge" group, i.e. coordinate invariance.)
This question might be interesting to a few select people, but there is certainly no "problem of motion in gauge theories of matter" at large, at least not in the way you portray it.
I mean, for classical gravity / General Relativity, one expects that, depending on whether the particle is charged or not, the equation of motion reduces to:
- the geodesic principle – i.e. the hypothesis that (uncharged) test particles at rest move along geodesics.
- a Lorentz-force-type law for gauge-charged test particles that (only) interact with a (fixed) gauge field and are otherwise at rest.
But both are quite well-established I'd say:
- The geodesic principle can actually be rigorously derived from the Einstein field equations for a large class of matter or situations[0]. Given this body of evidence, it's rather likely it's a mathematical theorem and does not actually need to be assumed as an axiom of General Relativity.
- The Lorentz law can already be derived[1] from the special-relativistic Lagrangian of the matter field and its coupling to the gauge field (where both fields are obviously classic, not quantum).
As for the latter, sure, strictly speaking the special-relativistic derivation (i.e. on a flat background) can only be a "local" one in light of General Relativity. In a fully relativistic derivation one should instead consider a curved background, i.e. the Einstein-Maxwell action (or a generalization thereof for arbitrary gauge fields). But then again – given the evidence for the geodesic principle – we know the Lorentz force must come from the interaction of the particle with the (fixed) gauge field (not gravity) and that interaction is largely "understood" – with the usual fine print that:
- forces are a classical concept but particles are actually quantum and there is backreaction (so the Lorentz force can only be the lowest-order term, anyway),
- obviously we don't really know how quantum fields work on curved backgrounds / in conjunction with General Relativity. Then again, we don't know how to make quantum fields mathematically rigorous on a flat background to begin with. So there is no point in asking for mathematical rigor in the context of deriving the Lorentz force from first principles when much larger issues would need to be tackled first.
So again, what "problem of motion" exactly would you like to see solved?
I use pipx for this. It isolates each tool in its own virtualenv. You might still have to install a specific Python version (although the latest 3.X works fine for me), but it takes some of the pain out of the process and makes sure you don't pollute your global/system install.
On a Mac, it could be as simple as (assuming you already use Homebrew):
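  brew install pipx
  pipx ensurepath            # puts pipx's bin directory on your PATH
  pipx install some-tool     # "some-tool" being a placeholder for whatever CLI you want isolated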
While APL dialects are very nice for this sort of thing, they generally don't understand units of measure or know about physical constants; you have to put those into them yourself. Here are some of my recent units(1) queries:
141 pounds force 30 mm # in joules
1160/4
log(3)/3/(log(2)/2) # how much more efficient is one-hot ternary than one-hot binary?
5V 7 μs / 7.3 A
.0117% half avogadro mol / 1.251e9 years / (potassium+chlorine)g # how radioactive is lite salt?
3.27$/gallon # in $/liter
sqrt(2 2000 electronvolt/electronmass)
18.8 foot pounds force # in joules
163$/(7.9 g/cc * 1500 mm 3000 mm 3.2 mm) # cold rolled steel price is higher than steel sold by weight
m3/4 / 15 cfh
2 pi sqrt(200 um / gravity)
rsync does actually support profiles via option aliasing. rsync uses popt(3) to parse its command line options, and popt allows you to define aliases, so you can put something like this in ~/.popt:
rsync alias --sync -rpcLDvz --chmod=D0755,F0644
A call "rsync --sync foo@bar:baz/ baz" will expand the command into "rsync -rpcLDvz --chmod=D0755,F0644 ...".
The "program alias newopt opt" syntax is actually popt's thing (see Option Aliasing[1]) and it works with everything that uses popt(3) for its command line parsing.
Here's an interesting bit of...abuse :) Since Bash is effectively stringly-typed, it can be used as a functional programming language, with pipes similar to function composition.
e.g.: wtfp.sh
#!/usr/bin/env bash
map() {
local fn="$1"
local input
while read -r input; do
"${fn}" "${input}"
done
}
reduce() {
local fn="$1"
local init="$2"
local input
local output="${init}"
while read -r input; do
output="$("${fn}" "${output}" "${input}")"
done
echo "${output}"
}
filter() {
local fn="$1"
local input
while read -r input; do
"${fn}" "${input}" && echo "${input}"
done
}
add() {
echo $(( $1 + $2 ))
}
increment() {
echo $(( $1 + 1 ))
}
square() {
echo $(( $1 * $1 ))
}
even() {
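# exit status 0 (shell "success") when $1 is even, which is exactly what filter tests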
return $(( $1 % 2 ))
}
sum() {
reduce add 0
}
map increment | map square | filter even | sum
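For instance, saving the above as wtfp.sh, making it executable, and feeding it a few numbers on stdin:

  seq 1 5 | ./wtfp.sh
  56

(1..5 become 2..6 after increment, then 4 9 16 25 36 after square; the even ones 4, 16 and 36 survive the filter, and sum prints 56.)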
Here is the source code to LLogo in MACLISP, which I stashed from the MIT-AI ITS system. It's a fascinating historical document, 12,480 lines of beautiful practical lisp code, defining where the rubber meets the road, with drivers for hardware like pots, plotters, robotic turtles, TV turtles, graphical displays, XGP laser printers, music devices, and lots of other interesting code and comments.
https://donhopkins.com/home/archive/lisp/llogo.lisp
Lars Brinkhoff got some of this code to run in MacLisp on an emulator! (I don't know how much of the historical hardware the emulator supports yet, but he's probably worked on some of that too. ;) )
Just a couple highlights from a detailed history of Logo that Brian and Leigh and others posted:
>From Brian Harvey:
>Many, many people have been involved in the development of Logo.
>Wally Feurzeig started the whole thing by organizing a group at Bolt, Beranek, and Newman, Inc., to study the educational effects of teaching kids a programming language. The first language they used, like most programming languages, was focused on numeric computation, and it was Wally's idea that kids would find it more natural to work in an area they knew better, namely natural language; therefore, he set up a team to design a language featuring words and sentences. Wally made up the name "Logo."
>The team Wally put together at BBN included Seymour Papert and Dan Bobrow. The three of them are credited as the designers of the first version of the language; Dan wrote the first implementation. In a BBN report dated August, 1967, credit for additional work is given to Richard Grant, Cynthia Solomon, and Frank Frazier.
>Seymour later started a Logo group at MIT, where Logo development continued. The MIT versions of Logo were substantially different from the BBN ones, both in the notations used and in the things the language could do. Most notably, turtle graphics was added at MIT.
>Among the many people who contributed to the development of Logo at MIT, at the risk of leaving someone out, are Hal Abelson, Paul Goldenberg, Dan Watt, Gary Drescher, Steve Hain, Leigh Klotz, Andy diSessa, Brian Silverman... oh, lots of people.
>I think that most of the early documents are out of print now, but whatever documentation there is of the early efforts will be in the form of technical reports from BBN and from MIT. You may have to visit Cambridge to find them!
>From Leigh Klotz:
>In the mid 1970's, when the AI Lab Lisp Machine project was just getting underway, Marvin Minsky and Danny Hillis (later to found Terrapin, and still later, Thinking Machines) put together a project to build a Logo machine. It had two components: a PDP-11 processor (the 3500) and a separate vector-graphics display (the 2500). Guy Montpetit, a Canadian entrepreneur, funded development eventually, and a company called General Turtle was formed. General Turtle built and sold the 2500/3500 system. Henry Minsky, then about 12, worked on the design of the 2500, using the Stanford Draw program, one of the early electronics CAD systems. (The 2500 had this really great barrel shifter stolen from the Lisp machine design, but it was later found not to work, so it was never used.) [...]
>[...] Like Brian, I've left out many people who worked on Logo over the years: Brian Fox and Flavio Rose worked for me at Terrapin on a contract basis briefly, as did vagabond programmer Devon McCullough (who used to dial in with a 300 baud modem he'd written entirely in software using the parallel game port, with an 80-column mixed-case display done with 3x5 pixel characters; when the modem detected the call waiting click on the line, it would make the Apple II speaker make the telephone ringing sound -- a feature which I just saw a US patent filed on, not by Devon.), and the frustrated Sinclair QX programmer, who I suspect doesn't want his name used. Of course, there were tons more people at the AI Lab in the pre-commercial days...
>From Lars Brinkhoff
>Hello,
>I'm mostly researching PDP-10 software, especially MIT's Incompatible Timesharing System.
>I have recently stumbled across some of the LOGO group work. I have a copy of the Dazzle Dart game that ran on their PDP-11/45. It uses the Tom Knight vector display controller, so it's not easy to run it.
>Maybe it would be possible to get the original MIT PDP-11 LOGO running.
>[...] It's running now.
>[...] Now also BBN PDP-10 Logo, MIT CLOGO, MIT Lisp Logo, and hopefully soon MIT Apple II Logo (direct ancestor of Terrapin Apple II Logo).
Terrapin Logo for the Apple ][ and C64 came with a 6502 assembler written in Logo by Leigh Klotz, that they used to write Logo primitives (for controlling the turtle, etc).
It would be ambitious to make a self hosting 6502 Logo meta assembler, by porting the entire 6502 assembly language Logo source code to the 6502 Logo Assembler!
Leigh, wasn't the assembler that you used for the original Apple ][ version of Logo written in MacLisp running on a PDP-10?
The Apple II Source Code for the LOGO Language Found (adafruit.com), posted by mmastrac on Oct 4, 2018
Lars: The link in the article to the code from https://github.com/PDP-10/its is broken. I found a few references to it in the repo. Did you have to take it down, or did you move it somewhere else?
>"I too see the computer presence as a potent influence on the human mind. I am very much aware of the holding power of an interactive computer and of how taking the computer as a model can influence the way we think about ourselves. In fact the work on LOGO to which I have devoted much of the past years consists precisely of developing such forces in positive directions."
>Seymour Papert
>"Logo is the name for a philosophy of education and for a continually evolving family of computer languages that aid its realization."
>Harold Abelson
>"Historically, this idea that Logo is mainly turtle graphics is a mistake. Logo’s name comes from the Greek word for word, because Logo was first designed as a language in which to manipulate language: words and sentences."
>Brian Harvey
>Logo was initially created by Wally Feurzeig, Seymour Papert, Daniel G. Bobrow, Cynthia Solomon and Richard Grant in 1967 as part of a National Science Foundation sponsored research project conducted at Bolt, Beranek & Newman, Inc., in Cambridge, Massachusetts. In 1969 Dr. Seymour Papert started the Logo Group at the MIT Artificial Intelligence Lab. Throughout the 1970s the majority of Logo development was conducted at MIT in the Artificial Intelligence Lab and the Division for Study and Research in Education, using large research computer systems, such as the ITS-powered PDP-10.
>Our goal is to make these early Logo systems available to a wider audience of enthusiasts for exploration, experimentation and, of course, hacking.
[...]
>MIT APLOGO
>According to Leigh L Klotz Jr., Hal Abelson directed the Logo for the Apple II project at MIT.
>MIT APLOGO was developed by Stephen Hain, Patrick G. Sobalvarro and Leigh L Klotz Jr. It was developed and cross-compiled for the Apple-II-Plus Personal Microcomputer on a PDP-10 at the MIT LOGO Group. It is the direct predecessor of Terrapin Logo. We have the source code for assembling an improved version from 7/9/81 at its/src/aplogo
> But how about curved space? Suppose I have two vectors in curved space - they look like two droopy arrows.
No, they don't. In math, curved spaces are modeled through so-called manifolds which locally look like ℝ^n. In particular, at any point p of the manifold there's a tangent space, i.e. a linear space (higher-dimensional plane) tangent to the manifold at p. Vectors at p are just vectors in that linear space. So they are "straight" not "droopy".
On the tangent space of each point p you can now define an inner product g(p). The resulting family of inner products g is called a (Riemannian) metric on the manifold[0] and describes how lengths (of vectors) and angles (between vectors) can be measured at each point.
> Currently, we define the curvature of space with gravity or the deflection of light past massive objects. Is there a way to measure the curvature of space locally?
Yes, there is. In fact, the curvature[1] of a (Riemannian) manifold is a purely local quantity – it's basically the second derivative of the metric g, so it describes how the notion of length changes (more precisely: how the change in length changes) as you go from a point p to neighboring points.
There are other ways to express what curvature is, e.g. by locally parallel-transporting[2] a vector along a closed curve (and making that curve smaller and smaller), which basically measures how the notion of straight lines changes locally. (Though, since a line being "straight" means "locally length minimizing", this brings us back to the notion of length and, thus, the metric.)
Alternatively, if the manifold has dimension 2, there's a particularly simple way of looking at and interpreting curvature, see [3].
In any case, curvature being used to model gravity is entirely separate from that idea.
[0]: Provided this inner product "varies smoothly" as you move from p to a neighboring point q.
If you cannot understand it, then it probably doesn't matter to you, because you have no problem to solve. Most of the time, physicists are trying to invent a mathematical formula or construct that elegantly describes a physical process, in order to make an accurate prediction.
Just imagine that we have a game where we want to accurately predict the next frames of a video. It's an interesting game in its own right, because you need to understand deeply what happens in the video to be able to accurately predict the behavior of all the objects, animals, and people in it.
Such a game requires a lot of skill to guess and predict accurately, but most of the time it's not important for us mere mortals. For example, we put a lot of effort into OpenGL, PBR, physics engines, etc. to make realistic games. Do you feel obligated to study all of that when you are interested in a realistic flight simulation? Do you feel obligated to study the construction of an AK when you like to play a 3D shooter?
If you really want to understand physics, then I suggest performing experiments, or playing with a physics simulation, or, even better, implementing your own. Look, for example, at this beautiful simulation of a black hole done in an OpenGL shader:
Similar: using it to get syslog out of self-chrooting ssh with minimal trauma:
socat -u UNIX-RECV:/home/sftponly/user/dev/log,mode=666 UNIX-SENDTO:/dev/log
It's also a CLI IMAPS client, with history:
socat READLINE,history=$HOME/.imaps_history EXEC:'"openssl s_client -connect mailserver:993"'
or a CLI web browser with history:
socat -d -d READLINE,history=$HOME/.http_history TCP4:www.domain.com:www,crnl
or, if you have to interact with something that has no readline, say sendmail:
socat READLINE EXEC:"sendmail -bt"
A similar tool: spiped[0]. Connects two arbitrary TCP4, TCP6, or UNIX socket addresses, encrypting + authenticating data using a preshared key. For many applications it's basically socat + security.
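If I remember the interface right (the addresses and key path here are made up), the two ends look something like this, with the same 32-byte keyfile on both machines:

  dd if=/dev/urandom bs=32 count=1 of=keyfile
  spiped -e -s '[127.0.0.1]:8025' -t '[192.0.2.10]:8025' -k keyfile   # encrypting end
  spiped -d -s '[0.0.0.0]:8025' -t '[127.0.0.1]:25' -k keyfile        # decrypting end, forwarding to local port 25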
Here is how they actually set the kernel for booting. It's hidden behind a `curl <url> | sh` in their post:
#!/bin/sh
bputil -d | grep "CustomerKC" | grep -v "absent"
KC=$?
if [ $KC -eq 1 ]
then
bputil -n -k -c -a -s
csrutil disable
csrutil authenticated-root disable
fi
cd /Volumes/Preboot
curl https://$LONGURL/linux.macho > linux.macho
kmutil configure-boot -c linux.macho -v /Volumes/Macintosh\ HD/
echo "Kernel installed. Please reboot";
So, it looks like they download their pre-compiled kernel onto a Preboot volume. This is then set as the booting kernel? How long until there is a grub-like option?
I should add -- I don't know what most of these commands are. It would probably be helpful if they had spelled these out more, but this is all a great place to start.
bputil - Utility to precisely modify the security settings on Apple Silicon Macs.
csrutil - Configure System Integrity Protection (SIP)
kmutil - replaces kextload, kextunload, and other earlier tools for loading and managing kexts
Apple's linker will automatically adhoc sign binaries on AS systems so it shouldn't require any work for most people.
Anything run from the Xcode UI (or from Terminal, if you use "spctl developer-mode enable-terminal" to show the Developer Tools group under Security & Privacy in System Preferences and then enable Terminal there) is exempt from Gatekeeper notarization checks. You can also put other terminal clients in the same list and they get the same benefit (child processes exempt from Gatekeeper).
On a similar note, "DevToolsSecurity -enable" allows any admin or member of the _developer group to use the debugger or performance tools without authing first. (Normally you must auth the first time, and the authorization can expire if you don't unlock your system after a certain amount of time.)
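In shell form (the sudo is my addition, since both change system-wide settings):

  sudo spctl developer-mode enable-terminal
  sudo DevToolsSecurity -enable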
I think there's no way to understand quantum computing without first understanding some linear algebra, specifically tensor products. How ten 2-dimensional spaces give rise to a 1024-dimensional space, how the Kronecker product of three 2x2 matrices is an 8x8 matrix, and so on. If you're comfortable with that, here's a simple and precise explanation of quantum computing:
1) The state of an n-qubit system is a 2^n dimensional vector of length 1. You can assume that all coordinates are real numbers, because going to complex numbers doesn't give more computational power.
2) You can initialize the vector by taking an n-bit string, interpreting it as a number k, and setting the k'th coordinate of the vector to 1 and the rest to 0.
3) You cannot read from the vector, but exactly once (destroying the vector in the process) you can use it to obtain an n-bit string. For all k, the probability of getting a string that encodes k is the square of the k'th coordinate of the vector. Since the vector has length 1, all probabilities sum to 1.
4) Between the write and the read, you can apply certain orthogonal matrices to the vector. Namely, if we interpret the 2^n dimensional space as a tensor product of n 2-dimensional spaces, then we'll count as an O(1) operation any orthogonal matrix that acts nontrivially on only O(1) of those spaces, and identity on the rest. (This is analogous to classical operations that act nontrivially on only a few bits, and identity on the rest.)
The computational power comes from the huge size of matrices described in (4). For example, if a matrix acts nontrivially on one space in the tensor product and as identity on nine others, then mathematically it's a 1024x1024 matrix consisting of 512 identical 2x2 blocks - but physically it's a simple device acting on one qubit in constant time and not even touching the other nine.
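To make that concrete in the smallest interesting case: a Hadamard gate H = (1/sqrt(2))[[1,1],[1,-1]] applied to the first of two qubits, with identity on the second, is the 4x4 orthogonal matrix H⊗I = (1/sqrt(2))[[1,0,1,0],[0,1,0,1],[1,0,-1,0],[0,1,0,-1]]. Starting from the string 00, i.e. the vector (1,0,0,0), it produces (1/sqrt(2))(1,0,1,0), so the read in step (3) returns 00 or 10 with probability 1/2 each.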
Nice short film. Langlands was a theory builder as opposed to someone like Erdos who was more interested in solving problems. Theory builders are often admired, but because the endeavor is so broad, very few of them emerge and even fewer are actually successful.
I like the part where he said he began to write before he understood everything, and in order to write he had to discover many things, and even had to discover them after he started to write.
It underscores the crucial role of writing in discovery. Most writers will tell you they are exploring the space during the writing process. Writing isn't a process of committing what you already know to paper; it's a process of learning what you don't know or haven't considered. It often leads you down paths you would never expect. (This happens to me with my HN comments too -- I often find myself writing a very different comment from the one I set out to write.)
This is why I think a Ph.D. dissertation should be a continuously evolving collection of notes, and not something you "write-up" in the end after all the work is ostensibly done.
Sure thing! In broad strokes, the most formal way (in our field's typically frequentist paradigm) to calculate confidence/significance is to build a robust statistical model for your entire experiment (including the random effect that various uncertainties, etc will have on the outcome), and then to randomly synthesize many many outcomes of your experiment with respect to a given hypothesis. The outcome of each "pseudoexperiment" is boiled down to a single number, known as a test-statistic, and this way we can generate a distribution of that test statistic that gives you an idea of the probability of any given outcome. Then, when we do the experiment for real and get a single outcome, we can evaluate the compatibility of that outcome with the distribution of possible outcomes sampled from the statistical model (e.g. by calculating a p-value). Generally the Profile Likelihood Ratio is chosen as the test statistic at LHC experiments.
This is possibly the most highly-cited statistics paper in our field:
https://arxiv.org/abs/1007.1727
It provides asymptotic formulae that allow us to estimate these p-values without having to simulate our experiment tens of thousands of times. But it also gives a nice overview of the different test statistics one might use.
"Practical Statistics for the LHC" is a much more complete and thorough introduction, but it's quite long and technical:
https://arxiv.org/abs/1503.07622
"Statistics for searches at the LHC":
https://arxiv.org/abs/1307.2487
Similar to the previous one, but maybe more words and less math. Perhaps one of these will resonate with you more than the others!
Lastly, here is a public note that was created jointly by the ATLAS and CMS collaborations prior to the Higgs boson discovery. Basically, we got together and came up with a consistent way to present new results from the LHC, and this note documents those protocols:
https://cds.cern.ch/record/1379837
As a sweet and short tutorial I can recommend these slides: https://www.cs.toronto.edu/~hojjat/384w10/
If you want to dive into how Prolog works under the hood I can recommend https://github.com/a-yiorgos/wambook