This, to me, is probably the most interesting thing about Sandstorm as it relates to open source web apps. Today, I can write an open source web app and post it to GitHub, but it takes someone with operational knowledge to deploy and run it. (Even if that's just "create a Heroku account, create a new app, git push", you still need to know something about development.) But it feels like open source web app distribution has been stagnant, and possibly even in a lull, since the WordPress/Drupal/etc PHP heyday.
My hope is that Sandstorm rejuvenates open source and indie web app development by providing a channel where end users can easily and safely run random web apps they find.
Since you give the example of GitHub... Sandstorm itself doesn't use its own Gitlab app to host and collaborate. How can they convince others? That doesn't make me confident in their platform. Centralized platforms like GitHub exist for a purpose.
When the Sandstorm project started, we obviously couldn't host it on Sandstorm. But, we are gradually moving towards more dogfooding. For example:
- We use Etherpad on Sandstorm for all documents we write, e.g. design docs or project plans.
- We use Wekan on Sandstorm for project / task planning.
- We use Rocket.Chat on Sandstorm for internal team chat.
- We commonly share and publish files using Davros.
- We host docs.sandstorm.io -- and the analytics for docs.sandstorm.io -- directly from Sandstorm.
- We host the Sandstorm app index (back-end for apps.sandstorm.io and for the automatic app update pipeline) as a Sandstorm app.
- We host the purchase flow for Sandstorm for Work as a Sandstorm app.
- We sometimes use private Gitlab or Gitweb grains on Sandstorm for non-public code, or for working on security fixes before they are ready to be disclosed.
That said, we still need to:
- Switch the main Sandstorm source repo to Gitlab or similar. This will obviously be a disruptive change, so we haven't tackled it yet.
- Host our e-mail on Sandstorm. Sandstorm's e-mail support admittedly needs a lot of work -- it's a much more complicated problem than most of the other things one does on Sandstorm, and we haven't put much effort into it yet. We'll get there eventually.
- Host our CI server on Sandstorm. This requires packaging Jenkins as a Sandstorm app, and also has a bunch of other complications relating to the fact that some of our tests need to run in VMs or use privileged syscalls, so it might take a while.
- Host our main web site on Sandstorm. Currently it's hosted as a plain-old static file server that we maintain. We could switch but there wouldn't be a ton of benefit in doing so, other than to say that we did.
If we can help you use GitLab CI to test your app, we would be happy to. Since GitLab already runs on Sandstorm, I think we can do it relatively easily compared to Jenkins. Of course, the privileged syscalls are still an issue. But maybe we can host the GitLab Runner on a non-Sandstorm box to work around that.
But the browser itself is an application delivery platform, and it includes a sandbox. So I'm curious why one would need something like Sandstorm on top of that to run apps?
Many (most?) apps need server-side logic. Apps that have multiple permission levels need to enforce those permissions somewhere. Apps that support real-time collaboration need to maintain the authoritative copy and coordinate the operational transformation (OT) stream. Etc.
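To make the permission-enforcement point concrete, here's a minimal sketch of the kind of check that has to live on the server, since a client could simply skip it. (This uses Flask with made-up document IDs, roles, and a stand-in header for real authentication; it's an illustration, not anything Sandstorm-specific.)

    from flask import Flask, abort, request

    app = Flask(__name__)

    # Hypothetical in-memory state: per-document ACLs and contents.
    ACLS = {"doc1": {"alice": "editor", "bob": "viewer"}}
    DOCS = {"doc1": b"hello"}

    @app.route("/docs/<doc_id>", methods=["GET", "PUT"])
    def doc(doc_id):
        user = request.headers.get("X-User")  # stand-in for real auth
        role = ACLS.get(doc_id, {}).get(user)
        if role is None:
            abort(403)  # document was never shared with this user
        if request.method == "PUT":
            if role != "editor":
                abort(403)  # viewers may read but not write
            DOCS[doc_id] = request.get_data()
            return "", 204
        return DOCS[doc_id]

Nothing stops a user from modifying their own client, so a client-side version of this check would be merely cosmetic.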
Online services are never going to be truly free-of-charge. (If you're not paying for the product...) In many cases the business plan is the reason that server interaction is necessary. However, there exists F/OSS that is truly free-of-charge for the typical user, so it seems feasible that many browser apps could be like that too, which would often obviate the server. For the example you cite, every client needs to agree eventually, so they could all keep their own copy.
Sorry, but you're massively underestimating the problem here. There's a whole lot more to it than "they could all keep their own copy."
Reliable distributed consensus without a "master" node is one of the hardest problems in computer science even when all the computers are always-on servers located on the same network. When we're talking about random laptops and phones all over the world that flicker in and out of connectivity -- and you want real-time collaborative responsiveness, and you have users who are not allowed to see the whole data set -- you're deep in unsolved problem territory.
If and when someone builds a real system that can handle this, performs well in real-world network scenarios, and is easy enough for typical application developers to use... then we can start the arduous process of rewriting all of our applications to use this model.
Instead of all this, Sandstorm aims to make it easy for everyone to have their own general-purpose server (maybe by using their own physical machine, maybe by paying a hosting service). This way we can keep using our existing code and existing application development practices.
All true, except you're really just hiding the problem behind an abstraction. If you want to have a reliable server instead of just a server, then you want more than one server. And now you're back to your massively complicated problem, if you insist that all servers are equal.
Of course, you don't have to make that assumption, neither on the server side nor in the peer model. Just say that the peer who created the document is the "server" (or use any other reasonable way to determine this), and when other peers have a copy that disagrees, then they update their copy. Now it's just like talking to a real server, except it happens to be another peer. Go ahead and use If-Match, 412, etc. (...or the equivalents in whatever protocol makes sense, of course)
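For the curious, here's roughly what that looks like over HTTP, sketched with the `requests` library against a hypothetical "owner peer" URL; `edit_locally` and `merge` are placeholders for the app's own logic:

    import requests

    DOC_URL = "https://owner-peer.example/doc/42"  # hypothetical peer acting as the server

    def edit_locally(text):   # placeholder for the user's edit
        return text + "\nanother line"

    def merge(theirs, ours):  # placeholder conflict-resolution policy
        return theirs + ours

    # Read the current copy and remember its version tag.
    resp = requests.get(DOC_URL)
    etag = resp.headers["ETag"]
    body = edit_locally(resp.text)

    # Conditional write: succeeds only if nobody changed it since we read it.
    resp = requests.put(DOC_URL, data=body, headers={"If-Match": etag})
    if resp.status_code == 412:
        # Precondition Failed: our copy disagrees, so update it and retry.
        latest = requests.get(DOC_URL)
        requests.put(DOC_URL, data=merge(latest.text, body),
                     headers={"If-Match": latest.headers["ETag"]})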
Such an immediate jump to a massive-scale solution for a few to at most a dozen hosts talking amongst themselves is a classic YAGNI.
You are equating the reliability of a server machine with the reliability of a user's client machine, when in reality these are not remotely similar. Laptops are regularly suspended and move between networks, losing connectivity. Phones even more so. Consumer internet is unreliable, usually with poor outgoing bandwidth. Mobile networks even more so. Etc.
When I share a document with five coworkers then close my laptop, the others need to be able to keep working. Under your model, that means they'd need to choose a new server. Guess what? This means you now need a consensus protocol. One where partitions happen 1000x more often, it is regularly necessary to operate without a quorum, latency is high and totally unpredictable, and all kinds of other properties that make the problem exponentially harder.
-Nowhere was it insisted that all servers are equal.
-Most people would prefer a hosted solution to hosting one locally (not constantly using their local machine's resources).
-For many apps the performance and uptime of a server is important enough that this simply wouldn't work.
-A hosted solution provides a stable, controlled, and relatively unchanging environment, depending on what else you use that box for.
-Unless there's guaranteed uptime (which there isn't in this case) you have even more merging headaches. (Read: 4 slave-peers (not really even 'peers' in this context) all having different diffs after a week of the master being offline.)
-Putting an app on a box somewhere isn't a 'massive scale solution'.
>If you want to have a reliable server instead of just a server, then you want more than one server.
No. Not for a personal or small app you don't. You can have 'reliable hosting' (say, one of the many VM providers backed by solid hardware and a RAID tolerant of multiple drive failures). Reliable doesn't mean completely immune to failure, and you're underestimating the reliability if you think failure is a regular occurrence. Your average hosting provider offers orders of magnitude more reliability than any user's local PC or mobile (even if we're talking about a constantly online desktop with wired ethernet).
It's just a ridiculously complicated thing you're asking for here, while simultaneously dismissing the parent comment as 'hiding the problem behind an abstraction'.
Hahaha, thanks everyone, there's lots here for me to consider. The peer model is probably better suited to ephemeral stuff like gaming than to document collaboration.
What I'm wondering now is, where is sandstorm going to run? If it's on AWS like everything else, I guess I misunderstood the point...
Sandstorm runs wherever you want it to. You can run it on your own physical machine -- even your desktop, if it's always on. A lot of people run Sandstorm on DigitalOcean. The company also offers Sandstorm Oasis (oasis.sandstorm.io), a hosted service.
Wherever you run Sandstorm (including Oasis), you as a user can upload and install any package, including ones you built yourself. Thus it accomplishes the goals in the blog post, even if physically the servers are "centralized".
Talking by phone is "free of charge": you pay for air time and buy a phone.
Equally, running a personal cloud can be free of charge, provided that you pay for connectivity and processing power.
Perhaps I was unclear. By "online services" I meant third parties that provide processing power and half of the connectivity. That is, they run servers. With a peer model, each collaborator pays for a fraction of connectivity and processing, but we wouldn't say any one of them is a service provider.
I'm in the process of trying to set up some things I'd like to seriously try in Docker containers (GNU Social and iodine, for example). Now I want to try to get Sandstorm working in a container as well. :-P
Honestly though, I was talking about this exact concept that Sandstorm has implemented a few years back. I was telling a buddy, "We have general purpose desktop OSes where we install apps. Why isn't there a building-block type server OS where we do the same thing?"
Of course I'm sure a million other programmers had that same idea, and it's an incredibly difficult one. Sandstorm seems to have a C++ base with JavaScript frameworks on top of it. I'll totally put it on my list of things to try out.
> Why isn't there a building-block type server OS where we do the same thing?
Because everything is built on Unix as an OS abstraction. Even Windows (since the early 2000s).
Consider: byte streams and file systems are less than ideal fundamental abstractions in a distributed world.
IMO we won't have what you're seeking until we rethink what an OS should be in an Internet-everywhere world. We're still using OS designs that predate the Internet entirely.
It would be nice to see a future where all applications have some concept of peer-to-peer networking and would be able to talk to each other. My hope is that this leads to blurring the line between having separate architectures for server, desktop and mobile apps, to the point where the only differences are reflected in the physical limitations of the device.
It's interesting to see how modern public cloud businesses seem to have borrowed a lot of their business models from old timesharing systems of the 70s. From there it's easy to analogize timesharing systems being killed off by personal computers to cloud computing being killed off by personal cloud.
I think decentralization is also about users owning/controlling their data. If someone creates, e.g., a running app and I'm not able to easily move my old runs to the new app, then diversity is nothing.
"Software must be provided as a package – not as a service – with each user running their own private copy."
This sounds like a going back to desktop apps. Or am I missing something?
> "Software must be provided as a package – not as a service – with each user running their own private copy." This sounds like a going back to desktop apps. Or am I missing something?
Yep, you're missing something: sandstorm is trying to be something like an open source app store for servers.
I believe sandstorm can be run on anyone's server (or VPS). Then, you can manage through its web interface (one-click install) various sandstorm-compatible apps onto that server.
Since you still control the server, it's your data, but because it's a server it can be accessed from your desktop computer, or phone, or whatever. I think there's an easy backup / transfer solution if, say, you want to move your server from Linode to DigitalOcean or to your own hardware.
edit: to clarify how this relates to the post:
I'm a web developer.
In my free time I'd love to throw together a little free budgeting web app, but I have a distribution problem: Either I centralize it, in which case I have to deal with data security, authentication, and scaling, or I release it as a rails app and ask my users to go through that rigamarole of installing and running their own rails server.
Instead, with sandstorm, I package up my rails app as a sandstorm app and put it in their app store, and anyone who wants a budgeting app can easily install and run it on their own private server.
> Either I centralize it, in which case I have to deal with data security [...] Instead, with sandstorm, I package up my rails app as a sandstorm app and put it in their app store, and anyone who wants a budgeting app can easily install and run it on their own private server.
You just made a case for NOT using a self-hosted open source app.
No, he hasn't. If he hosts the app on a central server, he ends up holding data that needs to be reachable via the internet and may be of varying importance that is unknown to him. He's a single developer with no ops team.
Anybody using it may have an organization that can support hosting, may be in a position to host it on a secure internal network, or may have data that is of low importance anyway. The user of the app is in a much better position to assess the value and evaluate the damage of a breach/loss.
I may have read the comment I was quoting wrong, but what I read was: if I, as a developer, create this software and centralize it/sell it as a service, I will have to deal with making sure the data is secure (otherwise I get sued), while if I distribute it with Sandstorm, there's no need to care about securing the data; it's on the user!
I hope we can agree that having software available outside a private network is something of value, so I really hope that if I use sandstorm the apps contained are not designed to rely on being deployed on an internal network...
Note: I extended my previous comment's quoted text for better context on that reply
Meanwhile Sandstorm itself is designed to manage its own security, e.g. by automatically updating, relying on hard-to-do-wrong authentication mechanisms (i.e. not passwords), etc., so that users running their own server do not need to know about server security.
Before the web, software publishing was inconvenient, both for users and developers: copying listings from magazines, mailing floppy discs & CDs, finding that there is a bad sector on disk #52 of your Linux distribution...
But once you had the software, you could use it on your local data, fully under your control.
The Web changed that landscape dramatically: it is arguably the platform with the lowest barrier to entry for all parties. Unfortunately this has been abused by huge siloed services. They are attractive because they are free, and ... actually useful, but they make you lose control over your data, how you can process and manage it.
Sandstorm aims at fixing that imbalance, putting the user back in control, as do a few other "easy self hosting" systems.
> I think decentralization is also about users owning/controlling their data.
Of course. I'm arguing, though, that while user control of data is important, software and developer diversity is more important. I realize this is likely to be a controversial claim. :)
> [If] I'm not able to easily move my old runs to the new app, then diversity is nothing.
Depends. The fact that I have many (old) documents in Google Docs doesn't prevent me from using Etherpad today, even though there's currently no automated migration. Also, quite a few services do actually provide data export and API access that could be used for export, which all suggests that data mobility is not the real problem here.
Don't get me wrong: I obviously think data mobility is important. But I don't think approaches like solid.mit.edu -- which seems to focus entirely on data mobility and storage rather than compute -- will actually fix very much if there's still a very high barrier to entry to software delivery.
Incidentally, Sandstorm's fine-grained containerization approach (https://sandstorm.io/how-it-works) makes it easy to use a new app for new data while keeping your existing app for old data, while still getting a unified experience. You can have repositories in Gitweb and Gitlab side-by-side on the same server, for example. And, of course, moving data between physical servers is completely independent of moving data between apps on Sandstorm. So you end up with a situation where you can move between apps and servers without really needing a way to migrate data, which I think is, technically speaking, more realistic than expecting everyone to use standardized data formats or write good export/import code.
> "Software must be provided as a package – not as a service – with each user running their own private copy." This sounds like a going back to desktop apps. Or am I missing something?
The next line is: "It doesn't really matter if the user chooses to deploy to 'the cloud' or to their own machine, as long as they can run any package they want."
I should also point out that sandstorm only gives apps a single writable mounted directory, and every "grain" (running instance of an app) has a download button that exports the contents of that mount point. So while sandstorm doesn't dictate data formats, you can get your data out and likely migrate it to another app.
Decentralization is the wave of the future, for the next 10 years. We have reached peak centralization on the internet (the most centralized being WeChat).
The Web is the most widespread decentralized user-facing platform, followed by email and then perhaps bitcoin and git. We should build on top of them.
There are two possible security models. One is sandstorm's - where everyone installs apps on their own cloud. You still need to get a hosting provider, but then it's a one-click affair. Like on DigitalOcean.
Or you can have apps use the browser security model to communicate across domains, with each app running on its own subdomain. All the powerboxes and other interaction between apps would happen through a user agent session under the control of the user, or through OAuth tokens that the user issues for apps to communicate behind the scenes.
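As a sketch of that second model (with hypothetical domains and a hypothetical token): each app lives on its own origin, and behind-the-scenes calls between apps carry a token the user issued, rather than a shared session:

    import requests

    # Hypothetical: the user granted tasks.example this token for calendar.example.
    TOKEN = "token-the-user-issued-for-this-app-pair"

    # Cross-domain, non-interactive call from one app to another, authorized
    # by the user-issued token instead of by cookies or a shared login.
    resp = requests.get(
        "https://calendar.example/api/events?after=2016-01-01",
        headers={"Authorization": "Bearer " + TOKEN},
    )
    for event in resp.json():
        print(event["title"], event["start"])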
There are pros and cons to each approach. The second approach allows the apps to be hosted anywhere, whereas the first one lets apps share capabilities.
But at the end of the day, you can have organizations host collaborative apps for members, and embed widgets from apps in other apps, which is really cool.
A key advantage, I think, of the first approach (Sandstorm's, in theory) is the ability of apps to run 24/7 and do useful stuff in the background, non-interactively, when the user is offline. Like receiving network messages, pulling RSS feeds, crawling the web on your behalf, or making your idle server's computing power available to your friends.
I think it's also about permission-free innovation, and maintaining a software and Internet ecosystem that allows that.
If we allow the Internet to become too centralized, ISPs and major cloud providers will have a large incentive to join forces or even consolidate. If you follow this trend the end-game could be an Internet that's like iOS: only whitelisted traffic is allowed, every link and protocol has to be approved, etc.
Then all innovation will stop because only huge players will be able to do anything new.
* Newer peer-to-peer file sharing apps never had a chance; Napster, Gnutella, and so forth came under attack by the RIAA/MPAA and government.
The trend is toward more centralization, not decentralization. With a last-mile duopoly, an antagonistic government, and the MPAA and RIAA beating war drums, decentralization has been under attack for years and the centralizing forces have been winning. I see nothing on the visible horizon that suggests things will change any time soon.
> The trend is toward more centralization, not decentralization.
What are you basing that on? When I think about journalism, that seems a lot less centralized to me. Bitcoin is decentralized money. Recording companies are giving way to smaller more specialized independent production companies.
I think software is in a weird recentralization due to this push towards services and away from products. But it's exactly the kind of technology cited above, and stuff like Ethereum, which will allow that trend to go back in line with the macro decentralization trend.
Maybe you can help me see what you're seeing that I'm missing?
Napster was centralized, that's why it was taken down. Now we have torrents. When a torrent site is taken down, 10 new ones replace it.
I think with projects like Ethereum, IPFS and others, there will be a huge push for decentralization in the near future. Of course it will be limited to enthusiasts at first, but not forever.
"Diversity" is quite an overloaded term nowadays and I'm hearing it more and more often in everyday vocabulary.
I wish I had some quantitative data on this, but it seems to be used in everyday contexts now, from engineering to research (diversity of data, samples), and of course, social justice contexts (non-white, non-male, etc).
I wonder if the increased prevalence of the word diversity is a result of co-opting definitions from the association that "diversity is cool".
I am a bit confused by what you are wondering; all of those examples you give are using diversity in the same way, to mean having a wide variety.
Why diversity is a good thing might vary from case to case, but it is almost universally a good thing.
In this case, I think the article author is arguing that the diversity of app distribution channels is good for the same reason biological diversity is good; if something happens to make one distribution channel go bad, there are lots of others that can take over the role.
It seems that, as far as marketing goes, "diversity" is the new "green".
Both terms obviously describe a good thing but people seem to be willing to suspend their disbelief whenever they encounter the term and laud something just for such claims (whether they are backed up by actual numbers or not).
I wonder if we'll see green-washing style "diversity-washing" too.
> a result of co-opting definitions from the association that "diversity is cool"
If you think that "diversity is cool" is just a given then I have to assume you are not from the USA, or have otherwise been insulated from the American culture war. (Lucky duck!)
In the US, the word "diversity" instantly alienates tens of millions of people.
To me this is the aspect that separates Sandstorm from the rest: it is not a suite of apps; it is a developer platform. This is less sexy in the short term but it has potential to be more powerful eventually.
If we could come up with data formats that could be used by multiple apps, that would prompt everybody toward decentralization.
However, the data we create is always highly coupled to the app in which it was created.
There's that effort called Solid, from Tim Berners-Lee, that is trying to change this, but they are so obtuse about ultra-complex standards that no one is using... maybe they're right.
Solid is a noble effort but I am skeptical that it would solve the problem, for two reasons:
1. Application functionality is often deeply coupled to its data format. When building on a standardized data format, how does one add new, novel features? If you extend the standard, then you now have a non-standard data format, and the features you added won't be recognized by other apps that use the format. No one wants to wait for the standardization process to complete before they can ship a feature, so inevitably apps will ship various incompatible extensions. Then, when you try to move your data between apps, you find that a bunch of stuff breaks. (See the sketch after this list for a concrete illustration.) There's no clear solution to this, other than for everyone to stop innovating, which obviously isn't what we want.
2. How does having standardized data formats suddenly prompt decentralization? Yes, it means you can more easily try out competing apps, but those competitors still face the same high barriers to entry. VCs don't like to fund the second player in a market, much less the tenth. Open source and hobby projects aren't suddenly able to compete when they still lack resources to run a service.
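As a toy illustration of point 1 (with a made-up format and made-up field names): two apps both "support" the same standardized contact record, but each ships its own extension, and moving data between them silently drops the non-standard parts.

    import json

    # App A extends the standard contact record with its own field...
    contact_from_a = {"name": "Alice", "email": "alice@example.com",
                      "x-appa-pronouns": "she/her"}

    # ...while App B shipped a different, incompatible extension
    # ("x-appb-avatar-url"), so it only understands the standard subset.
    STANDARD_FIELDS = {"name", "email"}

    # When App B imports App A's data, only the standardized subset survives:
    imported = {k: v for k, v in contact_from_a.items() if k in STANDARD_FIELDS}
    print(json.dumps(imported))  # the pronouns extension is silently lost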
This blog post actually started out being specifically a response to Solid, arguing that decentralizing storage alone (without compute) is not enough, but I ended up not feeling great about targeting them so I cut that part out...
I, as a programmer, am almost always able to migrate my data from one app to another by using APIs, the command line, and tiny, buggy scripts that fetch data from one service, reformat it, repackage it, and save it to another service which uses totally different formats.
Maybe a better solution would be to create easier ways to do that, so everyone will be able to move their data.
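Something like this is what those scripts tend to look like; a sketch with hypothetical endpoints and field names, not any real service's API:

    import requests

    OLD_API = "https://oldservice.example/api/notes"  # hypothetical
    NEW_API = "https://newservice.example/api/notes"  # hypothetical

    # Fetch everything from the old service...
    for note in requests.get(OLD_API, params={"limit": 1000}).json():
        # ...reshape each record into the new service's format...
        payload = {
            "title": note["subject"],        # the two services disagree
            "body": note["content"],         # on nearly every field name
            "tags": note.get("labels", []),
        }
        # ...and push it into the new service.
        requests.post(NEW_API, json=payload).raise_for_status()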
And in practice, the small, new entrants to a market will write migration tools to migrate away from the entrenched players as a way to get customers.
If this isn't happening, then there's apparently no will from either player to write migration tools, which suggests to me that there would similarly be no will to use standardized data formats, unfortunately.
> However, the data we create is always highly coupled to the app in which it was created.
The emails that I write are not coupled with any particular app.
The HTML files that I write are not coupled with any particular app.
The ogg vorbis files that I produce are not coupled with any particular app.
The vCard files that [...]
Really, that's always been a strategy of businesses trying to lock in their customers. But open formats with various interoperable implementations do exist and work just fine.
> Or can one random person, working in their spare time, build just the right app and reach millions of people?
I agree that it's very difficult for one person; however, to win doesn't always need to be defined as becoming the most popular, or reaching millions of people. What if it's 1,000 people paying $10/month? That is possible for one person going the service-oriented route.
This isn't a nitpick about Sandstorm per se — I do love what they're doing — and I understand they're making this argument because what they do isn't a service. But I do think it's possible to find diversity/decentralization in small, independent service-oriented companies that don't care about being popular on the scale of Facebook or Google.
> What if it's 1,000 people paying $10/month? That is possible for one person going the service-oriented route.
It may be possible for a single person to pull this off today, but it's quite a lot of work and responsibility, and it's hard to tell if it will pay off until you've put in a lot of upfront investment. Someone who writes a useful application for themselves on a weekend isn't likely to want to engage in this. But if all they have to do is upload the package they already built to an app store, that's a much easier pill to swallow.
Conversely, as a user, I am not very likely to want to put my data into a service run by one person, especially if the data is sensitive or if I don't want it to suddenly disappear. But if I can run the app on Sandstorm where it is automatically sandboxed and firewalled, with the data staying on my own server, and the app can never disappear, I'm much more comfortable.
No offense to sandstorm, but most of the apps indeed look like they were done in a weekend. The blog author grossly underestimates what is required to make a great, functional app (even those mobile apps that were done in a weekend aren't popular by any means). If this is the goal, then they should reposition themselves as an "OS for distributing your weekend project" (no snark, I am serious about this). It's a great niche if that is their initial target.
> and I understand they're making this argument because what they do isn't a service
That's true. I should have worded it differently. They're trying to attract developers to build apps for their platform, rather than launching them as a service (i.e., SaaS).
"Software must be provided as a package – not as a service – with each user running their own private copy."
My ideology resonates with this statement.
On a more rational level, I agree that an environment that promotes application diversity encourages competition and innovation and ultimately will keep the tech economy humming.
I can see that removing the infrastructure requirement for distributing and providing web applications opens up web application development to a larger group.
There are shades of similarity between this idea and the copyright reform and free culture ideas of Lawrence Lessig. Once IP hits a certain threshold of common knowledge, preventing others from freely using it inhibits economic growth. Once something becomes cultural, it's hard for innovation not to grow out of it.
An example is Pokemon: independent companies can't make Pokemon products without gatekeeping from Nintendo. Currently, if a third-party Pokemon product benefits consumers and drives the economy but is specifically bad for Nintendo, then it won't be allowed. The counterargument is to innovate somewhere else, but that ignores the fact that innovation is born out of experience: if I have no control over experiencing Pokemon regularly in my life, even though it's a cultural phenomenon, my ability to innovate is essentially handicapped.
From a more philosophical point of view, the idea that everyone should become a programmer is honestly just plain wrong. I like what these people are doing, but I don't think this could/should be done for more serious projects, considering the amount of hacking that's going on at more centralized services like Google Drive and Dropbox. But if that's not their aim then that's all good, I guess.
About shared data, can't you just use google spreadsheets?
> the idea that everyone should become a programmer is honestly just plain wrong.
I'm not saying that everyone should be a programmer. I'm saying that everyone who chooses to be a programmer and who produces useful code should have the ability to share their application with other people -- who may not be programmers.
Think of it like YouTube. YouTube doesn't replace film studios. But there's a whole lot of content on YouTube that you probably like, but that would never be produced by a film studio.
> the idea that everyone should become a programmer is honestly just plain wrong
For this to be fruitful, we need to define terms.
I agree that the notion that "everyone" should have the skill set of a full-time software engineer is silly. But I don't think that's what people mean when they say that.
The idea that "everyone" should be capable of creating software automation, I think, is a good one. By that I am thinking of things like spreadsheets, IFTTT, the late Yahoo Pipes, maybe some very basic scripting. I don't see this as substantially different than asking that "everybody" learn multiplication tables, basic algebra or how to balance a checkbook.
Or put another way, it is possible to function without basic algebra or trivially basic accounting knowledge, but extremely limiting. And I think that the ability to automate routine, simple things you want your computer/phone/cloud-enabled toaster-bunion remover to do on your behalf is becoming a similar basic life skill.
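For a sense of scale, the kind of "basic life skill" programming being described is roughly this (a toy sketch that ignores edge cases):

    import os
    import shutil

    # Automate a routine chore: sort loose downloads into folders by extension.
    downloads = os.path.expanduser("~/Downloads")
    for name in os.listdir(downloads):
        path = os.path.join(downloads, name)
        if os.path.isfile(path) and "." in name:
            ext = name.rsplit(".", 1)[-1].lower()
            target = os.path.join(downloads, ext)
            os.makedirs(target, exist_ok=True)
            shutil.move(path, target)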
Specialization of labor. Different people are good at different things. Some people have great people skills but are poor at math/logic. They might make great salespeople but poor engineers. Why not use everyone's strengths?
"When the only tool you have is a hammer, every problem looks like a nail." I remember my creative thinking class, where we were asked to brainstorm a solution for clearing the sidewalks of snow. I proposed a snow shoveling robot. The mechanical engineers had ideas like magnifying glasses above sidewalks to heat them, or making the sidewalks black so they absorb more heat. Those were probably more elegant/cheaper solutions. Society benefits from different viewpoints.
I think that argument at best supports the claim that there might be some things that not (almost) everyone should know/should be able to do, but certainly not that programming specifically is amongst those. Your argument would equally apply to arithmetic or reading and writing. Would you say that we should drop those from the school curriculum as well? If not, what distinguishes programming from those that it should not be a skill that's taught to everyone?
Sorry for the late reply, I somehow didn't see this.
"I think that argument at best supports the claim that there might be some things that not (almost) everyone should know/should be able to do, but certainly not that programming specifically is amongst those"
True, good point.
"If not, what distinguishes programming from those that it should not be a skill that's taught to everyone?"
Reading/writing are useful as they are important for conveying information: world news, stop signs, history, etc. Programming is useful, but I wouldn't call it fundamental the same way reading, writing, math, and history are. When people can't read or write, they can cause traffic accidents. They can vote foolishly (if they can find the polling place). People who can't do math will not understand their personal finances. People who don't know history will repeat it. I don't see society falling apart because people can't make their own apps.
Now, for what it's worth, I think it might be worthwhile to teach people some basic shell or Python scripting, as that could make their jobs more pleasant by reducing the repetitive, boring work. But would we add a year of mandatory schooling? Or would we cut something else? Because I do think those other subjects are necessary.
I just don't see why everyone should be a programmer, and I don't think I've ever heard a good reason, other than asserting that it's the future and there is no other way.
> I don't see society falling apart because people can't make their own apps.
But wouldn't people have made the same claim before literacy and math skills were (almost) universal? And wouldn't they actually have been right? In the world as it was, you obviously could survive without those skills. The question is: (1) how useful will it be in the future, and, even more importantly (2) which opportunities could humanity be missing if programming doesn't become a universal skill?
> Now, for what it's worth, I think it might be worthwhile to teach people some basic shell or Python scripting, as that could make their jobs more pleasant by reducing the repetitive, boring work. But would we add a year of mandatory schooling? Or would we cut something else? Because I do think those other subjects are necessary.
Well, I think another aspect is much more important: People should learn enough programming to be able to understand how the world works. Certainly, the goal should not be to enable everyone to build highly-reliable highly-scalable high-performance distributed systems, or whatever. Just as the goal of language and math classes is not to produce authors and mathematicians.
Now, I don't think that being able to create your own "Apps" would be out of the question. The only reason that's somewhat difficult nowadays is that the platforms and SDKs suck, not that it's somehow fundamentally a complex problem.
But overall, I think the goal should be to (1) give people some idea of what happens with their data behind the touch screen, so they can understand the power structures that result from the software's inner workings (because otherwise they will repeat history, not noticing how these new-fangled systems encode power structures that were previously determined to be dangerous for a society) and make informed decisions on how to use information technology, and (2) give them some idea of how to approach problems with mechanical solutions and possibly encode these solutions as software. With the latter, the point isn't even necessarily that they should be able to implement it themselves (though that certainly should be an option for sufficiently easy tasks), but that they have an idea that software could even be a solution, and maybe how they could explain their problem to someone more specialized.
In any case, the idea would not be to teach the details of, say, the Android API. The idea would be to teach general principles of automatic information processing.
As for other subjects to throw out? Sure, throw out woodworking and electronics classes (where that hasn't happened yet), but replace them with computer science/programming rather than with Microsoft product endorsement classes.
"But wouldn't people have made the same claim before literacy and math skills were (almost) universal? And wouldn't they actually have been right? "
Good point.
"People should learn enough programming to be able to understand how the world works."
Ah, I was responding to the assertion that everyone should be a programmer. I might have misinterpreted this to mean professionally, rather than passing knowledge.
"But overall, I think the goal should be to ..."
I agree, and I do get frustrated when people are surprised that their cloud storage files are on someone else's hard drive, or other similar situations. They really need to understand the implications of their decisions. And it would be good to give them another way to look at problems, another tool to use.
"As for other subjects to throw out? Sure, throw out woodworking and electronics classes ... Microsoft product endorsement classes"
Hmmm. Not sure I disagree, but woodworking is a valuable skill, and I think it's important for people to be exposed to different types of careers so they can make an informed choice. My dad does woodworking for a hobby, and I would rather have nice handcrafted furniture available than cheap plywood crap. I suppose they could learn to program CNC routers and make beautiful machine-crafted furniture. Not sure what you mean by electronics classes? As far as the office classes, there are people who really struggle with computers, and teaching them skills they will need in an office is useful. Maybe teaching them some basic UI design principles would do more to enable them to learn these programs, and others?
Well, you were lucky I saw your message at all :-) (you did upvote mine, didn't you?)
> Ah, I was responding to the assertion that everyone should be a programmer. I might have misinterpreted this to mean professionally, rather than passing knowledge.
Well, obviously I am only speaking for myself, but I think most people who suggest programming be taught as a universal skill don't expect everyone to become a professional programmer (though I guess chances are programming in some form or another could become pretty pervasive, at least in jobs that typically require some kind of higher education).
> Hmmm. Not sure I disagree, but woodworking is a valuable skill, and I think it's important for people to be exposed to different types of careers so they can make an informed choice. My dad does woodworking for a hobby, and I would rather have nice handcrafted furniture available than cheap plywood crap.
Sure, but I think it's far less important as a basic skill nowadays, and it's not really much of a job anymore either. So, if resources are limited, I think teaching computing/programming basics would be more important. Nothing wrong with offering optional courses/clubs/whatever, though, I guess.
> Not sure what you mean by electronics classes?
Like, soldering simple circuits, that kind of stuff.
> As far as the office classes, there are people who really struggle with computers, and teaching them skills they will need in an office is useful. Maybe teaching them some basic UI design principles would do more to enable them to learn these programs, and others?
I think so. And in any case, I think that teaching a proprietary product in public schools is completely unacceptable, pretty much no matter how useful a skill it currently might be. Schools should always teach generic, transferable skills, not products. It might well be that Casio calculators are useful. But that doesn't mean we have Casio classes; we have math classes that teach arithmetic, and possibly the use of calculators.
I did upvote you, I thought you contributed thoughtful responses :)
"but I think most people who suggest programming to be tought as a universal skill don't expect everyone to become a professional programmer"
wow, that moment when you realize it is possible that you have misunderstood an entire ongoing, industry-wide conversation.
"but I think it's far less important as a basic skill nowadays"
That's true, the average person does not need to make their own chairs anymore. And programming would be more useful to more people. But I sometimes like to think of high school as sampling things that you might be interested in pursuing further. Though I think we could get rid of one shop elective for a mandatory basics of computer programming type course.
Also, it is still a job, projected job growth 2014-2024 is 6%, which is average. [1]
"Like, soldering simple circuits, that kind of stuff"
Oh cool, we didn't have that. We did have metalworking classes, but I went with the computer ones. We had some cool stuff in middle school, like using radio broadcasting equipment.
"I think so. And in any case, I think that teaching a proprietary product in public schools is completely inacceptable, pretty much no matter how useful a skill it currently might be. Schools always should teach generic, transferable skills, not products."
I see your position, but I have seen so many people who think of using a program in a rote, step-by-step manner who would need this. Perhaps a basics of computers/programming type course would fix the "navigate the program like a maze" mindset? Perhaps we are experiencing a filter bubble like effect where we have lost touch with people who are not into software, and how they do not just grasp things like we do?
> I did upvote you, I thought you contributed thoughtful responses :)
More importantly, that unexpected upvote was what made me look for possible late replies ;-)
> But I sometimes like to think of high school as sampling things that you might be interested in pursuing further.
Yeah, sure, but then, there is so much that you potentially could cover, and only so much time to fit it in, you have to somehow select those few that do get offered.
> Also, it is still a job, projected job growth 2014-2024 is 6%, which is average. [1]
Well, I would argue that woodworking class is much closer to what cabinetmakers (used to) do than to a modern-day carpenter's job, and that really isn't much of a job anymore. I mean, people not only don't need to build their own chairs anymore, they don't need to build any chairs at all, except maybe for artistic reasons.
> Perhaps we are experiencing a filter bubble like effect where we have lost touch with people who are not into software, and how they do not just grasp things like we do?
Well, it's difficult to say, but I suspect that the way it tends to be taught is part of the problem. If you aren't ever shown the map, it's kinda hard to figure out the overall structure from being led through the maze. And from my experience, those people teaching this stuff often haven't ever seen the map either.
Not all programming is about math/logic. Programming is as diverse as writing. Saying only one kind of person should program is like saying only one kind of person should write.
I'm not saying they can't program, just that not everyone should be a professional programmer. I'm not trying to exclude people here; anyone who wants to can. But I don't think everyone needs to be a programmer.
One person can be enough to create an outstanding product.
There are stories of solo developers creating awesome products on their own, but there are far more stories of solo developers failing to ever come close to a finished product at all. Betting on a solo developer is high-risk, high-reward. Even if the developer is unreasonably talented in every aspect (i.e. a great programmer, great designer, great marketer, etc.), it can still be a matter of how far they can go without exhausting themselves.
A larger team of less skilled developers OTOH may still be good enough to deliver a finished product. Plus, even if each developer is less skilled at most aspects individually, their strengths can add up and easily surpass that of an extraordinarily talented individual.
The reason you hear about excellent solo devs winning all the time is that you don't hear about the millions of solo devs failing. It's plain old survivor bias.
Teams may hold back extreme outliers but they empower everybody else and drastically increase the chance of actually delivering.
I'm talking about "products" in the general sense. Software is a product. Even when developing an open source library there's more to it than just writing code.
If you mean: "How does Sandstorm the company ensure that it can't read Sandstorm users' data?", the answer is that we provide software that you can run on your own machine. The software is open source, so you can verify that it does not give us any ability to access your data.
"An OS for the cloud" is actually fairly descriptive. Sandstorm is both a set of
It's not politically charged at all. It was about technological diversity. Did you even read the article?
APIs and a distro for serving webapps based on those APIs. If you use Sandstorm, they claim your webapp will be easier to distribute, serve, and create. All of this was said clearly on the site's main page.
You have a valid point so I upvoted you. But I think you are getting downvoted for your tone (this is HN). Try to be more positive and give concrete actionable feedback (if you have any).
I am aware of the fact that I've pushed the boat out on criticism, but this, unfortunately, is a hard-assed market. Personally I think you have been too moonshot and not specific enough. I may be wrong, but hopefully my views give you a flavour of how bad-ass this unforgiving market can be. In the end, I believe that (possibly rude) criticism is part of the game. It may turn out that I'm the bozo here. But I stick by my opinion of your post.
This particular post was not designed to attract investors nor customers. It was simply some thoughts I had written late one night that I decided to post. The feedback has been overwhelmingly positive, your opinion notwithstanding.