I initially gave my two girls an old desktop computer with Linux on it. They started using it before I showed them how, figuring stuff out without me, and soon I was sitting riveted in the background, watching them discover things and trying to work out how they thought about computer UIs.
Years ago my kids went through a similar progression.
I set them up with a really old, crappy PC running Linux. Taught them how to tinker around on the CLI and run Zork in an emulator. They had many hours of fun.
Then set them up on the Alice programming environment on the communal/family PC (running XP), and they had fun with that for a long while, too.
Once they had free internet access, my daughter discovered Hannah Montana videos on YouTube, and the self-learning pretty much ended. Ditto w/ my son and watching video game playthroughs, though he got a little more incidental learning about computers by trying to tweak whatever rig he had access to in order to make games perform better.
However, the internet has really empowered my kids to learn all kinds of stuff. From my son setting up Minecraft servers and fighting with port forwarding on the home router, to my daughter learning sewing techniques and guitar chords online.
Today's PC environment has come a long way from my days of exploring arcane PEEK and POKE commands from the back of a magazine on my TI-99/4A (in between rounds of Munch Man, of course). There is so much more info accessible now (both utilitarian and trivial) that I just can't grouse about where most kids run with it. In the end, I think my kids are way better off now than I was back then, even though they're not nearly as inclined to tinker under the hood as I was.
One solution: get out of the house! :-)
We were a group of tech/dev parents struggling to get our daughters and sons interested in creating things with code. At home it was never the right time… So, we started a special family-oriented time, a kind of picnic/party for coding with our kids.[1]
We set aside three to three and a half hours on a Saturday or Sunday afternoon, invite other families, gather in a friendly place (science center, cultural center, company office…), and have fun programming together and eating cakes :-)
We started 3 years ago[2] and never stopped!
(We have one recurring challenge: finding good tools and (French) documentation to help kids who excel at Scratch or other 'kids languages' make the jump to Python or JavaScript. CodeCombat is not up to it, for example. That's why I personally started a generative art workshop for young kids based on Processing Python Mode, not Scratch.)
Just spent the past 3 weeks with my parents in their cabin 2 hours from civilization. The only internet connection is a slow $60/4 GB 3G link that's only stable in the early AM and late PM.
Many (many) years ago I was using an IBM XT. It had come from Singapore and the HD wouldn't seek when it was cold – my theory was that it'd been formatted on a very hot and humid day and it couldn't track when it changed shape in the cold.
Anyway. Big segue, but reminded me of that. If I had a task, I'd have to map it out on paper for an hour or two (while, I presume, the drive warmed up)... Don't think I've ever been as productive since!
I refrained from getting internet in my apartment for the first few months and found I was often more productive at home than I was at work.
As an extra bonus, my complex had a gym with free wifi. That meant, if I wanted to watch something from my streaming subscription, I had to work out too.
All this talk of it makes me think maybe I should give it a try again...
Try getting them interested in things like the Arduino. Make it father-daughter time or something to teach them the basics, to get them tinkering, poking at code and seeing results. Maybe get a codec chip and work with them on making a music player. Alternatively, maybe try pointing them at Scratch ( http://scratch.mit.edu/ ) so they can make their own things.
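If it helps to have something concrete for that first session, the classic starting point is a blink sketch. Here's a minimal one, assuming a classic Arduino board whose on-board LED sits on pin 13 (the timings are just illustrative):

    // Blink the on-board LED: the usual first taste of code
    // producing a visible, physical result.
    void setup() {
      pinMode(13, OUTPUT);       // drive the LED pin as an output
    }

    void loop() {
      digitalWrite(13, HIGH);    // LED on
      delay(500);                // wait half a second
      digitalWrite(13, LOW);     // LED off
      delay(500);
    }

From there it's a small step to swapping the LED for a piezo buzzer or a motor, which is where the tinkering really starts.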
One of the things that I think may be going on here is that exploration now feels daunting. It's hard to get excited about getting things to print "hello world" when, in another window, brightly colored animated characters are engaging in absurd hijinks. However, being able to create one's own absurd hijinks may just help rekindle that spark.
I second the suggestion to get them to get started with simple hardware stuff. There's nothing nicer than getting your first little robot running.
I'd recommend showing the benefit first and then getting them to tinker with the implementation, so you may want to preassemble the first one, demo it to them, and let them play with it and program it once they're already excited about the prospect of doing that.
Afterwards, I think the most sensible option would be to modify some Lego 9V motors [shop.lego.com/en-DE/LEGO-Power-Functions-M-Motor-8883] to interface with an Arduino motor driver shield. That way, they can tinker with the structure of the robot as well, and Lego lends itself to creative tinkering.
Much later, you might want to get them into programming robotic arms or small quadrotors using C, but even if you get them interested in it, it might take a while to get them to a skill level where they can do so.
Take heart, and permit my anecdotal counterexample -- with a happy ending.
In the mid-90s, we picked up a used Mac Plus for our 5-year-old. He loved it, had no problem playing the heck out of the games, negotiating menus, etc. (Although I did get irrationally mad when, as he was playing Spit on Saddam -- a local Apple guy had written it, so I'd loaded it for kicks -- he actually spat on the screen.)
So great, maybe I can start working in some HyperCard, give him some control. No interest whatsoever. I even hacked some of his games, to show him the possibilities of coding. He was amused, but still had no interest.
He got older and got the thrill of modem WoW with friends, but he still had no interest in coding -- although I did catch him learning the warez culture.
About this time, I became addicted to browsing the state surplus equipment lot, and found that they had, literally, a pile of computers they sold at auction for essentially the cost of hauling it all away. In each pile, there were always some Macs. So I got to know the computer scrap guys and would buy discarded Macs for $10 a pallet -- and they thought I was a sucker.
To the point: the kid loved taking computers apart -- especially after he discovered a PowerMac upgrade card in half a laptop that I'd told him to toss. Nothing like an elementary-school kid earning $200 on eBay to keep him interested. He also helped me refurb/clone a bunch of Macs for his brother's preschool.
Bottom line: He learned how to handle the guts of computers, and to build networks from surplus routers so all his friends could come over for game nights, etc. But he never wrote a line of code through high school, despite my best efforts.
Then he got to college, decided he wanted to do CompSci. He was way behind in coding, but -- since it was a small liberal arts school -- the department was glad to have him and was patient. Plus he had some hands-on hardware skills that few others had (because the family computers were too expensive to tinker with, or no one knew anything about the innards).
He's now a gainfully employed SysAdmin, with a wife and mortgage and personal projects on the side -- and all his old pals still get together for game nights.
Maybe I wasn't a skilled or patient instructor, or maybe kids are just going to be what they are.
I have a 1.5 year old and you gave me some hope. Thanks.
Out of curiosity, are you describing NJ and Rutgers? One of my close friends talked about computer auctions at the NJ state school with great fanfare when I met him in college; alas, after his HS experiences in the early 2000s with his professor dad, he discovered they had stopped.
I've seriously considered setting up an old-fashioned automatic timer to control the power going to our DSL modem so that our Internet service is only available for an hour or so out of every day.
My worry, though, is that I'd be unplugging the modem from the timer and hooking it directly into the wall whenever I work from home, and that'd be a slippery slope to just having it hooked up all the time again.
If your router supports Tomato[1], dd-wrt[2], open-wrt[3], or similarly featured firmware, you might have a feature to restrict access at certain times of the day based on MAC address.
This would allow you to work from home, while restricting access for your children.
The only issue I might see with this setup is if your work-from-home PC is the same as the one the rest of your family uses to access the internet.
Apple's Airport Express has that built in. http://9to5mac.com/2013/10/06/how-to-set-time-limits-on-your... states you can even set it up per device, so you can give your kids 7x24 access while making yourself more productive by limiting your own Internet access.
Let someone else set the device password, glue the network cable into your DSL modem and the Airport Express and you are all set.
I suspect internet access is what kills the joy of real work since you can be so easily distracted. Before you know it, one hour has passed and you didn't do anything.
I'm beginning to see a solution in a separate computer just for Internet access, as PG suggested several years ago. The one for work should just redirect everything to localhost or not have network access at all.
>> "I suspect internet access is what kills the joy of real work since you can be so easily distracted. Before you know it, one hour has passed and you didn't do anything."
I wonder if another part of the problem is that you can see other people's work online and give up when your achievements pale in comparison. I remember writing my first app -- a web browser in VB -- and I spent weeks working out how to copy and paste, create a menu bar, etc. I even got joy from figuring out how to create a splash screen and themes for my app. It used a built-in webview, so there was nothing very complicated about it, but I had to read documentation and figure stuff out, and it took quite a long time. I was very proud when it was finished. I can't imagine my feelings would be the same now. I could Google it, find the answer -- ready to copy into Visual Studio and run -- and I would feel stupid for spending weeks on it. Then I would see someone who built a framework used by thousands of people in a weekend, and I'd want to give up for good.
> It's sad but it's true and I wish I knew what to do about it.
For those of us who have already been spoiled by modern computing, we need to be shown what useful stuff we can do by learning about computers (obviously we would need to be on a relatively open modern computer). If most people are satisfied with whatever gets them through the more-or-less intuitive GUIs that let them watch some videos, do they really need to know more? Well, yes -- when their computer eventually starts behaving strangely and simply restarting it doesn't help... but that kind of frustrating troubleshooting doesn't leave a good taste in many people's mouths when it comes to dealing with computers.
If they are already used to modern, powerful computing, chances are they won't be impressed by the usual computer wizardry that impresses programmers, simply because they don't need, or don't see the utility in, shuffling a bunch of text around efficiently. Instead of becoming a whiz at retro computing -- which may demand wizardry precisely because it is so primitive -- what can we impress people with on modern computers that they can't do easily themselves (and would like to do)? They've already been spoiled by modern computing; for many, there is no turning back.
If the only real allure of being a computer whiz -- to most people, anyway -- is being able to operate arcane technology, then the value proposition simply isn't there for people who do not appreciate computing in itself (and even that might be an acquired taste, so it's sort of a catch-22). It's like a classical guitar virtuoso who can be motivated to learn songs and pieces solely for the technical challenge they pose, while his students just want to gain the skills necessary to play their favourite band's songs, or the songs they've written themselves.
Personally, I think if you want to kindle interest in people who aren't traditionally considered tech people, the usual course of action goes from user to power user to novice programmer. It goes from using the most apparent options, to using less apparent options that do the job better, to creating your own options and tools.
Say someone watches a lot of different streams. Show them how they can do it in VLC from one place. Show them enough that the utility of using VLC outweighs whatever method they used previously. Then give them some time until using that tool becomes a habit. Then show them how to use VLC's command-line options to record and convert streams so they can download them onto, say, their phone. Gently suggest: "There's a rabbit hole here, and it's filled with fun." Then give them some time to tinker around and see what they can do. Once they outgrow those shoes -- when they're running out of cool add-ons to install, or when there's something that's just not there yet -- you can start to teach them how to write their own additions.
Now this is just a single example, but I think it illustrates the point. You want to help them slowly build momentum and a need for better tools. Most people are cocooned in a comfortable level of software usage, and they believe their tools are sufficient and that learning more isn't going to be worth the effort. If you manage to successfully change those beliefs, their entire modus operandi regarding computers will change.
Now, what tool and which lesson gets that started differs from person to person. For some it's learning VBA scripting for work; for others it might be using AutoHotkey to create macros, or even learning some keyboard shortcuts ("Wait, alt-tab switches windows? That is so neat!"). Luckily, just about everything in common use has some kind of handy add-on that extends its functionality -- say, Reddit and RES -- or there's getting that Junior Junior Dev to move from Notepad to Notepad++ and then, after a good while, to vim or emacs.
Give them a working 8-bit system with hundreds of games and apps from the days before the Internet. There is still just as much working 8-bit and non-Internet software out there as there ever was: you just have to find the hardware.
Then, most important of all: don't leave them alone with it. Stay there and play with them. Maybe you'll enjoy a return to the 8-bit software development days too, just for the heck of it. Disclaimer: this worked for my two boys.
Maybe attempt to work with their consumption. If they like watching kids' TV shows online, show them how to download the videos and edit them. Maybe they can cut out the parts they find boring at first, then move on to making a best-of version of a season, to eventually remixing the shows into something completely new. There's some fairly decent and easy video editing software nowadays -- all drag and drop -- that could be fun for them to explore while still being able to consume.
The good thing is that this could be a gentle intro to critical thinking -- i.e. analysing a show to say what they like and don't like and why -- and then moving on to creating a better version for themselves... The tech skills could come later, when they bump into barriers to doing what they want -- e.g. not enough RAM or hard drive space -- so they then have to tinker to get the system to do what they want.
Straight-up banning them probably isn't productive, because yeah, it's nice to sit back and watch some TV sometimes. It might be a better policy to just limit the amount of consuming time.
My parents had limits on consumer computer/TV time when I was growing up (interestingly, both drew from the same pool of time even before internet TV was A Thing). If I wanted to hop on the computer, I'd always get asked what I was planning on doing with it (and then they'd drop by the computer station later to check in on whether I was actually doing that). If I was doing something educational, like reading up on a hobby, doing schoolwork, or programming, then I wouldn't be subject to any screen-time restrictions (other than those that came with it being a shared computer).
Though to a certain extent, kids just get stroppy if they don't get their way, and sometimes they just need to learn to deal. Which sucks for everyone involved in the short-term, but oh well. :/
There is something about 1980s-era computers that makes them so much more accessible to kids than today's systems.
It's probably to do with how much simpler they were. If I had to put my finger on it, I'd say it's because they didn't have lots of Things that could tempt you into distraction; for example, if they'd had a web browser of some kind back then, you'd have been tempted away from learning to program in BASIC, or from learning how to load that simple-but-fun game.
They were generally single-task machines that let you focus on one Thing.
I'm looking at my Linux desktop right now on this machine. I have a web browser I'm using to type this reply. It also has 8 other tabs open - more tabs will be added later as I continue on my search for knowledge. I have a Konsole terminal open with IRC sessions to multiple servers and channels open. I have PyCharm loaded. I have a VM running, and more to run later. All vying for my time and energy. A child using this machine would be overwhelmed.
Even a Raspberry Pi can distract its user in the same manner as my desktop.
Perhaps it's time to reintroduce today's kids to the CoCo2s, VIC-20s, C64s, Spectrums, ZX81s and so on. Getting kids to learn on single-task-at-a-time systems instead of the distraction-inducing tech of today might be a very good idea.
When my son was 2.5 years old, I put an ancient Compaq laptop running Debian in front of him. It had Tuxpaint on it, and I just let him go. Within a short space of time he was using Tuxpaint like a "pro", and then he learned how to power the machine up and type in his login name and password. Sure, you can do this with today's systems, but they do make it so easy to provide tons of distractions.
It's the enormity of the abstractions that have to be embraced, in my opinion, which leads to distraction. There is much between a file and its handle.
And thus: far, far more words -- real English and/or development-oriented words and symbols -- exist in all the layered abstractions of a typical desktop machine, whereas the 8-bit world's abstract taxonomy is very sparse.
There isn't really any good reason to throw away the 8-bit machines. They still work. And they still teach kids how computers work, even still today. I hope we see a return to this technology in a fashion that really compels people - young and old - to apply the ethos of those days, again. Distraction-free computing is one legacy of the earlier era.
I'm not so sure lack of distractions makes that much difference. I think on my C64 it was frequently the case that we'd switch games every few minutes. We were easily distracted with or without something specific to trigger it.
But these computers were simple enough to "mess with" and build a mental model of the whole. By the time I was 7-8, I started "experimenting" with pulling chips out of their sockets (to my parents' horror) and seeing what happened if I put them in different places (...). Even if you could do that on modern machines, with almost everything surface-mounted, you'd fry stuff. With those old machines, that was usually fairly hard to do; amazingly, I never burned out any chips that way. The simplicity encouraged meddling with things in a way that is harder to see younger kids do with most modern computers.
PCs are outright dangerous to let a young child dismantle on their own, whether or not you disconnect the power (I'm afraid my C64s were more than once opened up while connected...); machines like the Raspberry Pi are simple and safe enough, but there's nothing to do inside them. Likewise for tablets and smartphones: if you manage to disconnect the display without breaking anything, you're pretty much at the limit.
If you try to break into the black box as a child, you're faced with an image of something that is way beyond your comprehension.
The C64 and similar home computers, on the other hand, had clearly labelled diagrams in their manuals that even a child could understand. You'd open it up, and you'd see what the different things were, and you could tinker (at least if you didn't tell your parents) and see the effects.
It seems like it is harder to become "independent": Children learn a few "magic incantations" to get to the familiar things, and the knowledge required to bridge the gaps where exploration yields results that encourage further exploration seems higher. Or maybe that's what it looked like to our parents too.
A few years older, I soldered wires straight onto the pins of my Amiga's CPU. You could easily make a "pause" switch and a reset button that way -- you could also solder onto the bus exposed on the left-hand side of the A500s, but I had a hard drive attached there.
Speaking of hard drives: since the article mentions demonstrating floppy drives with the cover off, of course I did that with my 1541 drive for the C64. (I also tested what happened when moving the I/O chips between the 1541 and the C64 -- they're "almost" the same model, enough so that the C64 at least will run; same for the one from the Amiga, whose differences are small enough to be inconsequential for most programs.) But I also ran my first (20MB) hard drive with the cover off out of necessity: I bought it used, and within 6 months it started having problems starting up. The solution? Open it up and give the motor a helping hand spinning up the platter -- after that it worked fine. The drive survived another two years, I think, after that.
That hard drive probably had "stiction", where the read/write head sticks to the platter surface, preventing the motor from spinning the drive up. It can happen if you leave a hard drive running for a while and then shut it down without allowing it to park its head.
Modern hard drives are clever enough to park the head when they detect the power supply turning off, so it hasn't been such a problem recently.
The recommended solution for stiction is to lift the computer (or the hard drive, if it is separate) about an inch off the desk, and let go.
When I was a Customer Support Engineer for SGI way back, I had to swap out a load of IBM drives on lots of SGI kit at various customer locations due to this, which was dubbed "The IBM Stiction Problem".
Power off --> remove drive --> Sharply tap the drive on desk --> replace drive + add new drive --> clone
The look on customer faces at the "Sharply tap the drive on desk" stage was as priceless as the data on the hard drive B)
A while back, my workplace had a medium-sized disc array. That is, something like a hundred UW-SCSI 72GB drives, arranged into about twelve different RAID-5 sets. When we had to power it down because of scheduled electrical work in the building, about eight of the drives failed to come up again. It's a miracle none of the RAID-5 sets were broken. Hard drives don't like being switched off if they have been going for a couple of years.
And that is why RAID is no substitute for backups, people. A reasonable proportion of events that will take out a hard drive will take out more than one.
Maybe. What I maybe didn't make clear, though, was that this wasn't a one off thing: I had to manually spin it up every time I turned it on. After the first couple of times I didn't bother screwing the cover back on, and occasionally I'd leave it off to show friends...
I know someone who got his drive working again, after the heads had fused to the platter, by slowly heating the drive in an oven while it was connected. He then proceeded to copy the data off as quickly as possible. A poor man's version of what some recovery services do if the above advice of dropping/tapping it doesn't work...
This is great. When my daughter was 11 and wanted a 'computer for her room', I gave her a VAX[1]. She had a lot of fun learning a bit of C programming and playing advent and rogue. I had realized that my generation grew up with the very accessible computers of that time, and my kids did not. I think some of that need is being met by RasPis and Arduinos (look at how successful the Kano Kickstarter was [2]).
When you talk about folks like Gates or Jobs or Woz, or pretty much anyone from the early PC days, the stuff they "learned" on was pretty straightforward. Any high school kid could write a driver for an ISA card in DOS; that is certainly possible in Linux, but I find the learning curve to be much higher. And without those little triumphs to keep you going, it is hard to stick with it.
[1] A VAX 4000/VLC, which is a really compact and nice VAX, running NetBSD.
When my son was about four years old, I gave him an old electric typewriter which I got from Freecycle. I also gave him a stack of paper and some ink ribbon.
Within days, he was "typing" away, loading new paper, working the lever to return the carriage to the start, and replacing the ink ribbons. By the end of the month it was in pieces, as he tried to dismantle it to see how it worked. Luckily, I was supervising him so that he didn't electrocute himself. The mind of a young child is an amazing thing to watch.
This is part of something I hope takes hold in CS education: ontogenous education. Ontogeny is the study of the development of organisms throughout their lifecycle. Technology develops in a way that often makes the present as dissimilar from its roots as a caterpillar and butterfly. So by starting at the beginning (or a beginning at least), rather than the present we give kids a full grasp of why things are the way they are rather than the millions of other ways they could be.
For many learners technology is a turn-off because it seems 'arbitrary'. It is, in the same sense that a biological organism or historical event is arbitrary. It's only with context that these things start to become intelligible. So ontogenous CS education is about giving a historical context to modern technology.
I agree with this sentiment. Do you have any good recommendations for books on the history of computer science? Lots of options are available, but I'm looking for a technical overview of the subject.
I completely agree with this. A lot of the games seem to appeal more too, because they've been written simply, with straightforward rules.
The BBC Master (which my children love) was also a terrific platform to learn to program on. It's just a shame that a lot of the disk drives and disks haven't survived very well.
I've been doing a similar thing with my 6-year-old, who wanted to learn programming. I've been using JSBeeb (a browser-based BBC Micro emulator) and have started to put together a Coding for Kids "book" based on our "lessons".
It's still very rough and only partially complete but I'd appreciate any feedback http://c4k.rabidgremlin.com
Fire up some Defence Force, Harrier Attack, Doggy, or Zorgon's Revenge, from the good old days! YAY, 8-bit party! Space 1999, 1337, Pulsoids, Skool Daze .. STORMLORD! W00t!
:)
What's really great is that the 8-bit days are not over. I see this now, with my kids getting very much attracted to programming on the 8-bit machines.

    10 PING:WAIT 10
    20 GOTO 10

Represent!!
A big difference emerges in how young folks develop their skills depending on whether they spend their screen time creating, or consuming novelty and distraction.
We know the path of seeking and consuming novelty and distraction is a limited way of engaging the abstract and critical thinking skills needed to imagine how the hardware/software/network stack comes together. There are lots of studies on gaming (I was a voracious gamer until it got in the way of my creation and learning time), but I wonder whether spending time in someone else's virtual worlds helps people get out there and solve real-world problems.
In a way, learning by seeing and mastering a C64 is a timeless experience; only we don't have many C64s today as true starter computers.
That's where projects like the Arduino, Raspberry Pi, and others are so exciting; but there's little like seeing a physical floppy drive spin up and witnessing so many more of the steps in slow motion.
I've long thought that the original Game Boy Advance would make a great platform for kids to learn about hardware. Being solid state, it doesn't have the physicality the OP describes, but it does have a very understandable hardware and software model. Basically, 100% of the machine can be controlled via memory-mapped structs. No need to call a magical function from some Nintendo-sanctioned wizard. You can stomp the bits yourself and make stuff happen. I've written a toy program to draw on the screen in half a page of C. No includes, no libs: int main, vidmode = 3, pixels[n] = 255.
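For anyone curious what that looks like in practice, here's a minimal sketch along those lines (not the commenter's actual program; the register addresses come from the publicly documented GBA memory map, where DISPCNT sits at 0x04000000 and VRAM at 0x06000000):

    /* GBA video mode 3: a 240x160 bitmap of 16-bit pixels, no OS,
       no headers, no libraries -- just memory-mapped hardware. */
    #define REG_DISPCNT (*(volatile unsigned short *)0x04000000)
    #define VRAM        ((volatile unsigned short *)0x06000000)

    int main(void) {
        REG_DISPCNT = 0x0403;               /* video mode 3, background 2 on */
        for (int x = 0; x < 240; x++)
            VRAM[80 * 240 + x] = 0x7FFF;    /* white line across the screen */
        for (;;)
            ;                               /* spin: there's nothing to return to */
    }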
I owned a Spectrum, an Atari ST, an Amiga, and an Apple II. I learned 68K assembler and BASIC on them, and how to load programs from tape...
...and I hated them and got rid of them as soon as I could. Seriously, I don't understand how people love to use floppy disks, or tapes that sound like the machine is making coffee or something; it takes minutes to load a single program, there's no 3D, no Internet, it requires a TV that emits X-rays, and it's small and ugly.
If I want my kids solving something hard, I will give them robots or 3D printers, not try to force them to live the life that I lived 30 years ago.
The fact that we live in multithreaded environments does not mean that we have to use them.
The same could be said about the Internet. People say "there were no Internet distractions", but you can disconnect the network on any computer with a three-second click.
In fact, what threads do is make your life easier. I once wrote a multimedia program in DOS using INTs, and it was the definition of hell.
Now you can make a simple audio thread and another simple picture-animation thread, and the most complicated thing you have to deal with is managing mutexes, which is super easy if you understand them.
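A minimal sketch of that pattern, assuming POSIX threads (the buffer "mixing" and sleep intervals are made up for illustration; compile with something like cc demo.c -lpthread):

    #include <pthread.h>
    #include <stdio.h>
    #include <unistd.h>

    static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
    static int running = 1;              /* shared state, guarded by `lock` */

    static void *audio_thread(void *arg) {
        (void)arg;
        for (;;) {
            pthread_mutex_lock(&lock);
            int keep_going = running;    /* read the shared flag safely */
            pthread_mutex_unlock(&lock);
            if (!keep_going)
                break;
            puts("audio: mixing a buffer");
            usleep(100 * 1000);          /* pretend to fill an audio buffer */
        }
        return NULL;
    }

    int main(void) {
        pthread_t t;
        pthread_create(&t, NULL, audio_thread, NULL);

        usleep(500 * 1000);              /* "animation" work would go here */

        pthread_mutex_lock(&lock);
        running = 0;                     /* tell the worker to stop */
        pthread_mutex_unlock(&lock);

        pthread_join(t, NULL);
        return 0;
    }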
Multithreaded programming is not "super easy". Threads and parallelism give us power, but they definitely increase complexity and make it harder to reason about the behavior of programs.
This brings me back to my junior high days, when I bought a Centris 650 Mac at a yard sale for the express purpose of installing OpenBSD. That was my first real delve into open source operating systems, and it deserves much of the credit for my professional developer career.
Nice post. I loved my CoCo2 back in the day. I doubt their enthusiasm will last if they have alternatives, but it is fun to see the exploration, even if it is just for a little while. Planting the seed is important.
My Amiga 1200 is still working. I had souped it up with a VGA adapter, a 68040 card, amongst other things. I had to adapt a PC PSU to feed it the necessary power after that.
It has been with me to Japan for 6 years. I forgot to reset the voltage from 110V back to the UK's 240V, and the magic white smoke from the capacitors escaped the PSU.
But the Amiga itself survived to tell the tale and is working happily with the new PSU ;)
But with the CD route, I trust them with something that cost a few cents, and with an interface they already know (play, stop, and next track buttons). Way easier for them and far fewer consequences for trips or spilled water!
Here's an old blog post about it: http://williamedwardscoder.tumblr.com/post/19500788060/my-te... -- I think it's a fun read.
Fast forward to now; that blog post is hopelessly out of date!
I gave them old-but-decent laptops and, eventually, internet access.
As soon as they had internet access, they stopped tinkering and exploring and started using the laptops only to watch repeat episodes of children's TV.
And now they often want to use their mum's iPad -- to play music and watch TV -- but they are completely and utterly uninterested in tinkering with any PCs.
It's sad but it's true and I wish I knew what to do about it.