jdswain's comments | Hacker News

I liked this comment because it said what I was basically thinking. In my last job I kept everything separate; a lot of people didn't, and when we all got made redundant I just closed my work laptop and walked away, while some of the others were scrambling to disentangle everything and had all sorts of problems.

But in my new job I'm doing work on my personal laptop. It started because I travel by plane to work regularly, and I was carrying three laptops: my personal one, my work one, and one provided by my current client. It was just so much easier to combine everything into one laptop and carry only that. It's working out really well. Before, I was constantly moving from one laptop to another just to check messages.

I think doing work on your personal computer is less bad than having personal stuff on your work computer; I wouldn't do that.


I got an 8" Android tablet instead of an iPad mini. What I wanted was something really compact that I could use emacs on, mainly for org-roam, notes, and writing in general, not for writing code. It works well with termux; I don't think there is a good way to run a local version of emacs on iOS.

The keyboard is the most important part really (although I did want a good screen too). I'm on my second keyboard; they are only about $30 each, which is better than iPad prices. The first one wasn't so convenient to unfold quickly, but the new one is working really well.


The trouble is there is no way to turn it off. I've nothing against that kind of thing in the right place, but for me Facebook is not that place, and it sneaks in no matter how hard you try to prevent it.

Here are some funny fail videos... of girls in bikinis. Here are some images for the sport you are interested in, with far too revealing angles.

So I don't use Facebook any more, and feel much better for it.


I've come to a similar conclusion lately. I used to spend a lot of time jumping on my latest idea, researching it and even writing a bit of code to get started. I'm much better off writing my ideas down, hopefully in a form where I can consolidate them into a few projects; but even if it just stops me wasting time on an idea that I'll soon lose interest in, it's worthwhile.

On the other hand, I've got one large ongoing series of projects that are relatively pointless, and that is the whole point of them. I'm learning from them and I don't have the pressure of completing anything (although I would really like to). From time to time they have been really important for my mental health, and I value them for that.

I'm also quite aware now that I need to feel achievement to keep me motivated, and if I have a plan and can check off an item on that plan then that counts as a good day. Work doesn't always give me that sense of achievement, so having something else that does can be really valuable.


I think the built-in keyboard idea had run its course, so any possible Apple IV would probably end up looking a lot like an Apple IIgs. The little keyboard extension on the IIgs case is kind of reminiscent of the Apple ///.

It would have been good to end the line with what was planned for the Mark Twain; internal floppy and hard disks would have made the whole system a lot better. And of course, if we could have got a 14 MHz 65C816 as well, it would have been a really interesting system.


It can still look like a /// with a detached keyboard that’s a continuation of the main case. Commodore did that with their office machines (the rounded ones).

I think the beige is important as part of the apology ;-)

Frankly, I’d accept a beige Mac Mini with a rainbow Apple logo on top and call it a day.

I’m sure Tim Cool lurks around these pages.


Given how there's plenty of market for retro nostalgia kit, I'm surprised nobody is making a modern desktop PC that pushes the same aesthetic buttons.

Beige paint is no more expensive than white or black. A desktop form factor would be good for today's super-heavy video cards. Since we no longer design around internal optical drives, etc., you could probably whip up something micro-ATX that looked a lot like an IIgs or the IIsi mockup, maybe with some USB ports in the front "lip" for convenience's sake.


> Tim Cool

Fun typo for Tim Apple. ;-)


It slowed down to 1 MHz for I/O and Apple ][ compatibility.

I wouldn't call it a disaster; its failings were mainly on the sales and marketing side, and that also had a lot to do with the IBM PC coming out around the same time.

It was probably the most complex 6502 design, and it mainly consisted of discrete logic chips rather than the custom chips that other manufacturers were starting to use. It had advanced features like an additional addressing mode to access up to 512k of RAM without bank switching. (Plus two-speed arrow keys.)


Rockwell and WDC had 65C02s up to 4 MHz relatively early on, but the 4 MHz versions seemed to be quite rare. WDC now has 65C02s rated at 14 MHz, but they go quite a bit higher than that if you've got fast enough RAM.

There are some technical details on why a 4 MHz Z80 is roughly equivalent to a 1 MHz 6502. As always with processor design, there are tradeoffs in every decision. The Z80 had a 4-bit ALU, but I'm not sure whether that slows it down.

The Z80 has a more complex architecture than the 6502. A 6502 clock cycle is one bus cycle and simple instructions can execute in one clock cycle. For the Z80 a clock cycle is called a T-state, and one machine cycle consists of multiple T-states. A simple instruction like INC A takes 4 T-states.
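As a rough back-of-the-envelope sketch of that equivalence argument (the per-instruction averages below are assumptions chosen for illustration, not measured figures):

    # Toy throughput comparison: assumed "typical" instruction costs,
    # chosen only to show why the two chips land in the same ballpark.
    Z80_CLOCK_HZ = 4_000_000      # 4 MHz Z80
    M6502_CLOCK_HZ = 1_000_000    # 1 MHz 6502

    # Assumed averages over a typical instruction mix (not measured):
    # the simplest Z80 ops are 4 T-states but memory/16-bit ops run 7-19,
    # while most 6502 instructions take 2-4 cycles.
    Z80_AVG_TSTATES = 8
    M6502_AVG_CYCLES = 3

    z80_ips = Z80_CLOCK_HZ / Z80_AVG_TSTATES        # ~500,000 instructions/sec
    m6502_ips = M6502_CLOCK_HZ / M6502_AVG_CYCLES   # ~333,000 instructions/sec

    # Within a factor of ~1.5 -- and the 6502's zero page often stands in for
    # registers, so it tends to need fewer instructions for the same work.
    print(f"Z80 ~{z80_ips:,.0f} ips, 6502 ~{m6502_ips:,.0f} ips")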


Not quite: the design is such that even the fastest 6502 instructions are two cycles. Still, generally quicker clock-for-clock than a Z80.


There is also some tiny pipelining at play: the 6502 accesses (reads or writes) memory on every clock cycle. A 2-cycle instruction reads the opcode on the first cycle and does the actual work on the second, which leaves the second cycle's memory access free for doing something cool like fetching the next instruction's opcode.
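A toy cycle-by-cycle bus trace (an illustration of the overlap only, not a model of the real 6502) might look like this:

    # Hypothetical bus activity for a run of back-to-back 2-cycle instructions:
    # while instruction N does its internal work, the bus is already busy
    # fetching instruction N+1's opcode.
    trace = [
        (1, "fetch opcode of instr 1"),
        (2, "execute instr 1 / fetch opcode of instr 2"),
        (3, "execute instr 2 / fetch opcode of instr 3"),
        (4, "execute instr 3 / fetch opcode of instr 4"),
    ]
    for cycle, bus_activity in trace:
        print(f"cycle {cycle}: {bus_activity}")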


People used to get productive work done on DECstations; they were big and expensive in their time. Now we can recreate them for just a few dollars (plus the cost of a screen and keyboard). Today almost everything we do relies on the internet, so a wifi driver would be useful as well.

Many things we do today require more processing power, but many things do not: writing, terminals (well, SSH could be a problem), email, HN. We used to do raytracing on a DECstation, and had to use a remote X window to view the finished image in colour.

You would think that a certain subset of people would quite like a simpler system today to work on, but I guess it's just easier to buy something modern with all the extra layers of complexity.

Maybe this is because today programming largely relies on having access to the accumulated knowledge of the internet, and a very complex web browser.


My PhD was done on a DECstation 3100. The physics lab was a VAX environment (everyone had VTxxx terminals on their desk), but someone had bought a 3100, not figured out how to use it, and it was sitting in a corner, normally switched off. I managed to persuade them I could put it to use when I joined the group, and about 6 months later everyone else in the group had Unix workstations too… we named them all after Asterix characters; mine was getafix.


The article title reminded me of when I was young and used to read Byte Magazine. Byte used to cover a wide range of topics, and could get quite technical, but the big thing that is vastly different to today is that you would get a monthly digest of articles that were selected by the editors, not by yourself. And I used to read it cover to cover. There was a lot I didn't understand, but also I feel like I gained a wider knowledge than if I only read what I was interested in, and many times the ideas that I was exposed to turned out to be useful much later in life.

Some of them ended up being distractions too, like playing with hardware, or writing a compiler, but it was all very interesting.


Byte magazine was a terrific publication. There's nothing similar in print these days that I'm aware of. Certainly, Byte couldn't be accused of dumbing down the content to reach a wider audience, unlike many of today's supposedly technical magazines. I learned a lot from Byte and experimented frequently with the knowledge and understanding I gained from Byte.


I wrote a lot of BASIC and 6502 assembler code inspired by Byte. So much learning.


No, and there was no (easy) way to detect the vertical retrace. For a lot more on that topic, have a look at the Apple II mouse card: they needed to synchronise with the video and did work out a software-based way of doing it, but the final product added hardware to make it possible.

