Hi, David Tristram here, founding member of Raster Masters, a 1990s computer graphics performance ensemble. As @hopkins has mentioned, we used high-end Silicon Graphics workstations to create synthetic imagery to accompany live music, including, notably, the Grateful Dead, Herbie Hancock, and Graham Nash.
After many iterations I'm currently working mainly in 2D video processing environments, Resolume Avenue and TouchDesigner. The links here are inspiring, thanks for posting.
Who were the other people in Raster Masters, and what crazy stories from Grateful Dead concerts can you tell? ;)
Every time I've plugged a modern projector into a laptop at a presentation it's been stressful, like rolling the dice on whether the screen will ever come up. What kind of projector, calibration, and preparation did it take to project live high-res SGI video onto the screen above the band?
>RGB color separation and processing is obtained using vertical wobbulation of the electron beam on the oil film to modulate the green channel and sawtooth modulation is added to the horizontal sweep to separate and modulate Red and Blue channels. The optical system used in the Talaria line is a Schlieren optic like an Eidophor, but the color extraction is much more complex.
>A wobbulator is an electronic device primarily used for the alignment of receiver or transmitter intermediate frequency strips. It is usually used in conjunction with an oscilloscope, to enable a visual representation of a receiver's passband to be seen, hence simplifying alignment; it was used to tune early consumer AM radios. The term "wobbulator" is a portmanteau of wobble and oscillator. A "wobbulator" (without capitalization) is a generic term for the swept-output RF oscillator described above, a frequency-modulated oscillator, also called a "sweep generator" by most professional electronics engineers and technicians.[1] A wobbulator was used in some old microwave signal generators to create what amounted to frequency modulation. It physically altered the size of the klystron cavity, therefore changing the frequency.
Infrared Roses is a live compilation album by the Grateful Dead. It is a conglomeration of their famous improvisational segments "Drums" and "Space". The ElectroPaint stuff begins around 11:00, but the Raster Masters did all kinds of different stuff in parallel and mixed it all together in real time. I remember them describing some "recursive texture map" feedback too, which only ran on high-end SGI workstations.
Electropaint on SGI Indy: A capture of the great screensaver electropaint on an SGI Indy. There is no sound (originally I had Mahavishnu Orchestra's "Miles Beyond" but youtube flagged me for it), but feel free to blast your own music while watching:
>In the spirit of J-Walt's intro message, I'm David Tristram, somewhat of a pioneer in the use of real-time graphics for live performance. Author of Electropaint and Electroslate live performance instruments, and founding member of Raster Masters. Toured with Grateful Dead, developed performance system for Graham Nash and Herbie Hancock.
>I'm just playing with things these days, most recently making music with a small modular system and experimenting with very simple looping visuals in an investigation into the perception of visual rhythms. Here is my most recent test.
If anyone wants to play around with psychedelic graphics without going too low-level, [hydra](https://hydra.ojack.xyz/) is a cool javascript based livecoding environment with a gentle learning curve.
I've been working on a free open-source macOS app for just that: https://nottawa.app. Hoping to release in the next couple of months!
The UI has been greatly improved since I recorded the original demo on the site; the real thing is MUCH better now. Same base idea - chain together shaders, videos, or webcams and then drive their parameters via an audio signal, BPM, oscillator, MIDI board, or manual sliders.
The beta link on the site isn't really worth trying yet - if you're interested in getting on the TestFlight just shoot me a message at joe@nottawa.app. Would love some HN feedback :)
The code isn’t anything to write home about, it’s in C++ leveraging OpenFrameworks and OpenGL. I’m an iOS and macOS dev, but after the initial release I’ll get started on porting to Windows and Linux. OF generally works well multi-platform so I’m hoping it won’t be too hairy.
I'm specifically targeting the non-technical artist/creator market, ideally with optional macOS App Store distribution. I've been involved in the live visuals scene in NYC a bit, and something I commonly heard was that musicians and DJs wanted visual accompaniment that just works out of the box. TouchDesigner etc. are incredibly powerful, but generally out of reach for non-technical folks.
I’ve contracted a great artist from UpWork who’s been making presets which will be included. There should ideally be as little friction as possible for a user to go from first launch to live, audio-reactive visuals.
Hydra actually works well with music input! It grabs audio from the mic and `a.show()` will show you the frequency bins. Then any numerical parameter can be modulated by the intensity of a bin, for example:
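A minimal sketch (the bin count and scaling factors here are arbitrary, and you'll need to let the browser use the mic first):

a.setBins(4)                       // split the mic signal into 4 frequency bands
a.show()                           // draw the bins so you can see what's coming in
osc(30, 0.1, () => a.fft[0] * 4)   // let the lowest bin drive the oscillator's color offset
  .rotate(() => a.fft[3] * 2)      // and the highest bin drive the rotation
  .out(o0)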
Is it possible to grab from the default audio output device instead of the mic? Probably not, as it's browser-based. I suppose the mic input can be faked at the OS level somehow.
I used to spend so much time messing around with MilkDrop in Winamp. You could grab existing visualizations and see what they were doing, and make your own edits. Thanks for the nostalgia hit!
Regarding the OP doc and UV coordinates: a major area of investigation for us back in the day was finding interesting ways to displace the UV texture coordinates for each corner of the rectangular mesh. We used per-vertex colors; these days one would use a fragment (pixel) shader like those on ShaderToy.
A very interesting process displaces the texture coordinates by advecting them along a flow field. Use any 2D vector field and apply displacement to each coordinate iteratively. Even inaccurate explicit methods give good results.
After the coordinates have been distorted to a far distance, the image becomes unrecognizable. A simple hack is to have a "restore" force applied to the coordinates, and they spring back to their original position, like flattening a piece of mirroring foil.
Just now I am using feedback along with these displacement effects. Very small displacements applied iteratively result in motion that looks quite a bit like fluid flow.
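For anyone who wants to try the idea in a browser, here's a rough Hydra sketch (the constants are arbitrary, and this is just the same idea expressed with feedback and modulation, not the original per-vertex approach): the previous frame is read back, its coordinates get nudged a tiny amount along a noise field each frame, and a weak blend with the undistorted source acts as the restore force.

s0.initCam()                        // undistorted source; an image via s0.initImage(...) works too
src(o0)                             // read back the previous frame (the iteration)
  .modulate(noise(3, 0.05), 0.003)  // displace the coordinates a tiny amount along a noise "flow field"
  .blend(src(s0), 0.03)             // weak restore force: keep pulling back toward the source
  .out(o0)
render(o0)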
That was how Jeremy Huxtable's PostScript "melt" worked (he also invented the original NeWS "Big Brother" Eyes that inspired XEyes): choose a random rectangle, blit it with a random offset, lather, rinse, repeat. It shows how, by repeating a very digital, square, sharp, angular effect with a little randomness (dithering), you get a nice smooth organic effect -- this worked fine in black and white too, of course -- and it's just PostScript:
%!
%
% Date: Tue, 26 Jul 88 21:25:03 EDT
% To: NeWS-makers@brillig.umd.edu
% Subject: NeWS meltdown
% From: eagle!icdoc!Ist!jh@ucbvax.Berkeley.EDU (Jeremy Huxtable)
%
% I thought it was time one of these appeared as well....
% NeWS screen meltdown
%
% Jeremy Huxtable
%
% Mon Jul 25 17:36:06 BST 1988
% The procedure "melt" implements the ever-popular screen meltdown feature.
/melt {
    3 dict begin
    /c framebuffer newcanvas def
    framebuffer setcanvas clippath c reshapecanvas
    clippath pathbbox /height exch def /width exch def pop pop
    c /Transparent true put
    c /Mapped true put
    c setcanvas
    % Blit 1000 random rectangles, each shifted down by a small random offset.
    1 1 1000 {
        pop
        random 800 mul
        random 600 mul
        random width 3 index sub mul
        random height 2 index sub mul
        4 2 roll
        rectpath
        0
        random -5 mul
        copyarea
        pause    % yield so the rest of NeWS keeps running
    } for
    framebuffer setcanvas
    c /Mapped false put
    /c null def
    end
} def
melt
Here's Jeremy's original "Big Brother" eye.ps, that was the quintessential demo of round NeWS Eyeball windows:
Oh sorry, I didn't explain: they're interactive PostScript scripts for the NeWS window system, so they don't actually print, they animate on the screen! The "pause" yields the lightweight PostScript thread and lets the rest of the window system's tasks run. NeWS had an object-oriented programming system that was used to implement the user interface toolkit, window management, interactive front ends, and even entire applications written in object-oriented PostScript. NeWS is long obsolete, but you can run it in a Sun emulator!
It uses an iterated feedback pixel-warping technique kind of like melt.ps, but instead of melting the screen by simply blitting random rectangles vertically, it spins the pizza rotationally, which melts the cheese and pizza toppings -- note the randomization of the rotation to "dither" it and smooth out the artifacts you'd get by always rotating by exactly the same amount:
% Spin the pizza around a bit.
%
/Spin { % - => -
    gsave
        /size self send               % w h
        2 div exch 2 div exch         % w/2 h/2
        2 copy translate              % move the origin to the center of the canvas
        SpinAngle random add rotate   % rotate by SpinAngle plus a little random jitter
        neg exch neg exch translate   % move the origin back to the corner
        self imagecanvas              % blit the canvas through the rotated transform
    grestore
} def
It animates rotating a bitmap around its center again and again as fast as you "spin" it with the mouse, plus a little jitter, so the jaggies of the rotation (not anti-aliased, 8 bit pixels, nearest neighbor sampling) give it a "cooked" effect!
It measures the size of the pizza canvas, translates to the center, rotates around the middle, then translates back to the corner of the image, then blits it with rotation and clipping to the round pizza window.
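You can get a similar "cooked" rotational feedback effect in something like Hydra these days. A rough analogue (not the NeWS code; the image name and the numbers are just placeholders):

s0.initImage('pizza.png')                             // any source image; the filename is a placeholder
src(o0)                                               // the previous frame is the feedback buffer
  .rotate(() => 0.05 + (Math.random() - 0.5) * 0.02)  // small spin each frame, jittered to dither the jaggies
  .blend(src(s0), 0.02)                               // keep feeding a trickle of the original back in
  .out(o0)
render(o0)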
I love how easy it is to write shaders that operate on images in HTML. My skills in this area are mediocre but I love seeing how far people can take it. Even providing a simple approximation of a depth map can really make the results interesting.
Some years ago I did a similar project to smoothly crossfade (with "interesting effects") between images using some of the same techniques. My writeup (and a demo):
to the static image. Having had a range of psychedelic experiences in my life, this appears to be the closest visual match to the real thing, at least at low, non-heroic doses. Maybe slow the waves down and lessen the range of motion a bit.
Note: I am far more interested in replicating the visual hallucinations induced by psychedelic compounds than by making cool visuals for concerts and shows, utmost respect for both sets of artists though.
There is an artist (and I'm sure many more) who does a fantastic job with psychedelic visuals using fully modern editing stacks; unfortunately their account name entirely escapes me. I'll comment below if I find it.
The comparison that I would make with this portion of the Rolling Hills article would be the mushroom tea scene from Midsommar, specifically with the tree bark. The effect of objects “breathing” and flowing is such a unique visual and I love to see artists accomplishing it in different ways.
It's probably not who you were talking about, but this account on YouTube does a good job of representing the visual experience, while also talking about other effects. The videos looking at nature, where the visuals start to form geometric patterns and surfaces take on that "breathing" effect, are powerful. The author covers various substances, and how the effects can range from minor (slight "breathing" or pulsing of surfaces) to full geometric "worlds" (such as from DMT, although I've never dipped into that substance).
That is not who I had in mind but after looking through their account I’m going to binge their videos, very cool stuff. I always found that studying the minute differences in these substances is such a genuinely interesting topic. It’s covered a lot in Mike Jay’s Psychonauts.
In the early 90s, Todd Rundgren released a Mac app called Flowfazer - it didn't simulate your experience, but it was helpful as a distraction to move you along. Some people used it to provide guidance for their own creations.[2]
If this is your kind of thing and you ever get a chance to see the musical artist Tipper alongside Fractaled Visions driving the visuals, you’re in for a treat.
Most spot on visual depictions of psychedelic artifacts I’ve witnessed.
Saw them together last year, and it's the no. 1 artistic experience of my life. The richness and complexity of Fractaled Visions' visuals are almost unbelievable.
Even knowing a lot about shader programming, etc., some of the effects left me thinking "wtf, how did he do that?"
Here's the set; it doesn't fully capture the experience, but it gives a feel. Seeing this in 4K at 60fps was next level.
This video (back in the Flash days) is how I discovered the electronic group Shpongle. Their remix of Divine Moments of Truth is used in this animation. I believe the version is the "Russian Bootleg" version. I had been into electronic music before this, but this genre of electronic really blew my mind when I heard it.
I've been writing WebGL shaders at work this week, noodling with the details to make things look like physical camera effects. Occasionally I'll get something wrong and see results that look similar to the stuff in this article, and I have to say it is just so much more fun than the standard image effects.
Sure, there might be limited use cases for it visually, but playing with the models we've built up around how graphics in computers work is a great way to learn about each of these systems: not just graphics, but fundamental math in programming, how GPUs work and their connection to memory and CPUs, how our eyes work, how to handle animation/time, and so on.
This might have been written just for me, I love the premise.
I am truly fascinated by people who attempt to reproduce the actual physiological vision effects of psychedelic drugs.
Psychoactive drugs can be probes into the inner workings of our minds - in some scientific sense - and exploring the vision effects seems likely to suggest interesting things about how our visual system works.
Mostly, I am just impressed when anyone is able to capture the visual experience in graphical effects, with any level of realism.
> Mostly, I am just impressed when anyone is able to capture the visual experience in graphical effects, with any level of realism.
I have to say that the cliche of super bright, super saturated, geometric or melty shapes like in the article is not a great reproduction of the typical visual effects of psychedelics. Except at very high doses, the visual effects are much more subtle.
This is 100% not what psychedelics look like. It's generally just mildly more saturated colours and the feeling that everything is possibly breathing or swaying in a more natural way. I dunno what happens if you take insane amounts tbf. I always thought that psychedelic art was a bit more about the sort of thing that is super appealing to look at while tripping.
Maybe the most "scientifically accurate" replication of psychedelics are in these "DeepDream" images.
They were originally made to debug neural networks for image recognition. The idea is run the neural network in reverse while amplifying certain aspects, to get an idea on what it "sees". So if you are trying to recognize dogs, running the network in reverse will increase the "dogginess" of the image, revealing an image full of dog features. Depending on the layer on which you work, you may get some very recognizable dog faces, or something more abstract.
The result is very psychedelic. It may not be the most faithful representation of an acid trip, but it is close. The interesting part is that it wasn't intended to simulate an acid trip. The neural network is loosely modeled after human vision, and messing with the artificial neurons have an effect similar to how some drugs mess with our natural neurons.
Fun thing: in relativity u,v are typical variable names used for a really funky coordinate transformation that mixes space and time, sometimes called Penrose coordinates [1]. So when I saw this:
> uv.x = uv.x + sin(time + uv.x * 50.0) * 0.01;
> uv.y = uv.y + sin(time + uv.y * 50.0) * 0.01;
I thought, wow, what on Earth is going on here? But no, it turns out that it's not that psychedelic. They could have used p,q or any other variable pair, but it's still quite interesting geometrically [2].
Slightly off topic: Is there a way to create meshes and animate them directly inside Blender programmatically? Sort of like Shadertoy, but instead of drawing, sculpting, and rigging manually, I write some code that generates meshes and runs shaders on them for effect?
Blender is deeply extensible and largely written in Python, but it also has a full-blown visual node programming language for procedurally modifying and generating textures, shaders, 3D geometry and meshes, parametric objects, etc.!
Actually, Blender has an abstract base "Node" set of Python classes and user interfaces that you can subclass and tailor for different domains, to create all kinds of application-specific visual programming languages.
So you can visually program 2D video filters, GPU shaders, 3D geometry, animations, constraints, state machines, simulations, procedural city generators, etc., and each can have its own compilation/execution model, tailored user interface, node libraries, and connection types. Geometry Nodes have the visual-programming-language equivalent of lambdas: functions you can pass to other functions that parameterize and apply them repeatedly, iterating over 3D geometry, texture pixels, etc.
Blender extensions can add nodes to the existing languages and even define their own new visual programming languages. So you can use a bunch of integrated, tightly focused, domain-specific visual programming languages together, instead of trying to use one giant, general-purpose, but huge and incoherent "uber" language (coughcough Max/MSP/Jitter cough).
Here's a paid product, an incredibly detailed and customizable city generator (and traffic simulator!) that shows off what you can do with Geometry Nodes. It's well worth the price just to play with as a video game and to learn Geometry Nodes from:
Using The City Generator 2.0 in Blender | Tutorial:
Really interesting! I'm very much interested in psychedelic graphics. I played around with Shadertoy a little bit; maybe I should give it another go.
For anyone interested, I made some cool visuals by interpolating prompts in Stable Diffusion 1.5, like https://m.youtube.com/watch?v=ajfMlJuDswc. I found that the older diffusion models are better for abstract graphics, as they look more "raw" and creative.
A film by Jim Crutchfield, Entropy Productions, Santa Cruz (1984). Original U-matic video transferred to digital video. 16 minutes.
James P. Crutchfield. Center for Nonlinear Studies, Los Alamos National Laboratories, Los Alamos, NM 87545, USA.
ABSTRACT: Video feedback provides a readily available experimental system to study complex spatial and temporal dynamics. This article outlines the use and modeling of video feedback systems. It includes a discussion of video physics and proposes two models for video feedback, based on a discrete-time iterated functional equation and on a reaction-diffusion partial differential equation. Color photographs illustrate results from actual video experiments. Digital computer simulations of the models reproduce the basic spatio-temporal dynamics found in the experiments.
1. In the beginning there was feedback ...
James P. Crutchfield. "Space-Time Dynamics in Video Feedback." Physica D 10 (1984): 229-245.
If we're sharing, this is my effort at psychedelic graphics - animating a gradient over a live video feed (all done using a boring 2D canvas, because I don't have the brain capacity for shaders) over on CodePen: https://codepen.io/kaliedarik/pen/MWMQyJZ
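For anyone curious about the general recipe (this is a stripped-down sketch of the technique, not the actual pen): each frame, draw the current video frame to the canvas, then paint an animated gradient over it with a compositing mode.

const video = document.querySelector('video');
const canvas = document.querySelector('canvas');
const ctx = canvas.getContext('2d');

navigator.mediaDevices.getUserMedia({ video: true }).then(stream => {
  video.srcObject = stream;
  video.play();
  requestAnimationFrame(draw);
});

let t = 0;
function draw() {
  t += 1;
  ctx.globalCompositeOperation = 'source-over';
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);      // the live feed
  const g = ctx.createLinearGradient(0, 0, canvas.width, canvas.height);
  g.addColorStop(0, 'hsl(' + (t % 360) + ', 100%, 50%)');       // hues drift over time
  g.addColorStop(1, 'hsl(' + ((t + 180) % 360) + ', 100%, 50%)');
  ctx.globalCompositeOperation = 'overlay';                     // blend the gradient into the video
  ctx.fillStyle = g;
  ctx.fillRect(0, 0, canvas.width, canvas.height);
  requestAnimationFrame(draw);
}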
A bit of a tangent, but I'm surprised how heavily visualisers and the like always seem to focus on packing in as much colour as possible. With OLED screens it feels like there's a ton of potential for making really great black-heavy ambient visuals, so that an idle TV can become a feature of a room's decor rather than just a big black rectangle in the middle of it.
Yeah, I have an older LG and it has a disappointingly simple 4k fireworks visualization when it "sleeps" that always makes me wish I could create a custom replacement.
I write semi-psychedelic paint and video mixing software for personal use. Here’s a video from last year of mixing a few things together, hopefully some here enjoy it :)
Ben - so glad I stumbled on this article. Love this kind of graphical stuff (I'm a huge sucker for psychedelia) and I really enjoyed your videos on your channel. Thanks for sharing!
>DonHopkins, 11 months ago, on: John Walker, founder of Autodesk, has died
>I really love and was deeply inspired by the great work that John Walker did with Rudy Rucker on cellular automata, starting with Autodesk's product CelLab, then James Gleick's CHAOS -- The Software, Rudy's Artificial Life Lab, John's Home Planet, then later the JavaScript version WebCA, and lots of extensive documentation and historical information on his web page.
CelLab:
>I'm amazed that my beloved CHAOS still runs beautifully on emulators like DOSbox. It was the last programming project where I could completely roll my own interface - and maybe my last really fun one.
Here's some stuff I did that was inspired by Rudy Rucker and John Walker's work, as well as Tommaso Toffoli and Norm Margolus's wonderful book, "Cellular Automata Machines: A New Environment for Modeling":
by DonHopkins on Aug 7, 2023, on: My history with Forth and stack machines (2010)
>"Cellular Automata Machines: A New Environment for Modeling" is one of my favorite books of all time! It shows lots of peculiarly indented Forth code.
https://donhopkins.com/home/cam-book.pdf
I performed it in real time in response to the music (see the demo below to try it yourself), and there's a particularly vivid excursion that starts here:
The following longer demo starts out with an homage to "Powers of 10", and is focused on SimCity, but shows how you can switch between simulators with different rules and parameters, like setting rings of fire with the heat diffusion cellular automata, then switching to the city simulator to watch it all burn as the fires spread out and leave ashes behind, then switching back to another CA rule to zap it back into another totally different pattern (you can see a trail of destruction left by not-Godzilla at 0:50 while the city simulator is running).
I had to fix some bugs in the original SimCity code so it didn't crash when presented with the arbitrarily scrambled tile arrangements that the CA handed it -- think of it as fuzz testing. Due to the sequential groups of 9 tiles for 3x3 zones, and the consecutive arrangements of different zone types and growth states, the smoothing heat diffusion creates all these smeared-out concentric rings of zones for the city simulator to animate and simulate, like rings of water, looping animations of fire, permutations of roads and traffic density, rippling smokestacks, spinning radars, burbling fountains, an explosion animation that ends in ash, etc.
Chaim Gingold's "SimCity Reverse Diagrams" visually describes the SimCity tiles, simulator, data models, etc:
You can play with it here. Click the "X" in the upper left corner to get rid of the about box, use the space bar to toggle between SimCity and Cellular Automata mode, the letters to switch between cities, + and - to switch between tile sets (the original SimCity monochrome tiles are especially nice for cleansing the palette between blasts of psychedelic skittles rainbows, and the medieval theme includes an animated retro lo-res 8-bit pixel art knight on a horse), the digits to control the speed, and 0 to toggle pause. (It's nice to slow down and watch close up, actually!):
As you can see, it's really fun to play with to music and cannabis, but if you're going to use any harder stuff I recommend you get used to it first and have a babysitter with you. Actually, the whole point of my working on this for decades is so that you don't need the harder stuff, and you can put it on pause when your mom calls in the middle of your trip and you have to snap back to coherency, and close the tab when you've had enough.
I've had countless LSD, DMT, and mushroom trips, as well as a few others, and nothing about that video reminds me of anything "psychedelic", not even close. I guess it's subjective, but I have to wonder what was going on in their head when they decided "this looks psychedelic, let's call it psychedelic".
>Barlow's paradigm seems cheeky without awareness of the Net's public roots and how it came up through BBS and Fidonet culture, which is forgotten by those who only saw the Net as a gift from the ivory towers of academia and the military rather than from bedroom Z80 & 6502 modem culture.
>In another comment reply to Gumby, I mentioned how I often accidentally call them "Grateful Dead Conferences", because so many tech people I knew and worked with in Silicon Valley and the Free Software community, and regularly saw at computer conferences and trade shows, would show up at Dead shows.
>The Raster Masters would lug enormous million dollar high end SGI workstations across North Shoreline Boulevard from SGI headquarters to Shoreline Amphitheater, and actually pack them into trucks and travel on tour with the Dead, performing live improvisational psychedelic graphics on the screen behind the band in real time to their live music, using an ensemble of custom software they wrote themselves, mixing together and feeding back the video of several SGI workstations in real time.
>At one concert, some hippie came up to me, pointed at the graphics on the screen behind the stage in awe, and said, "I took all these shrooms, I'm tripping my balls off, and you would not fucking believe what they're making me see on the screen up there!!!" I explained to him that I hadn't taken any shrooms, but I could see the exact same thing!
>The Raster Masters wrote and performed their own software, which reflected the taping and sharing culture of the Dead scene, including ElectroPaint and the Panel Library from NASA, whose source code and recorded live performances were distributed with SGI's demo software and free source code library.
>The improvisational software was like a musical instrument performed in real time along with the music.
[...Lots more stuff with links and videos at the link:...]