Talks that changed the way I think about programming (opowell.com)
593 points by greywolve on Dec 8, 2016 | 95 comments


Alan Kay's intro quote is from this interview with Dr. Dobb's [0]. Here's some more context for that quote:

"Binstock: Are you still programming?

Kay: I was never a great programmer. That's what got me into making more powerful programming languages. I do two kinds of programming. I do what you could call metaprogramming, and programming as children from the age of 9 to 13 or 14 would do. I spend a lot of time thinking about what children at those developmental levels can actually be powerful at, and what's the tradeoff between…Education is a double-edged sword. You have to start where people are, but if you stay there, you're not educating.

The most disastrous thing about programming — to pick one of the 10 most disastrous things about programming — there's a very popular movement based on pattern languages. When Christopher Alexander first did that in architecture, he was looking at 2,000 years of ways that humans have made themselves comfortable. So there was actually something to it, because he was dealing with a genome that hasn't changed that much. I think he got a few hundred valuable patterns out of it. But the bug in trying to do that in computing is the assumption that we know anything at all about programming. So extracting patterns from today's programming practices ennobles them in a way they don't deserve. It actually gives them more cachet.

The best teacher I had in graduate school spent the whole semester destroying any beliefs we had about computing. He was a real iconoclast. He happened to be a genius, so we took it. At the end of the course, we were free because we didn't believe in anything. We had to learn everything, but then he destroyed it. He wanted us to understand what had been done, but he didn't want us to believe in it.

Binstock: Who was that?

Kay: That was Bob Barton, who was the designer of the Burroughs B5000. He's at the top of my list of people who should have received a Turing Award but didn't. The award is given by the Association for Computing Machinery (ACM), so that is ridiculous, but it represents the academic bias and software bias that the ACM has developed. It wasn't always that way. Barton was probably the number-one person who was alive who deserved it. He died last year, so it's not going to happen unless they go to posthumous awards.

Binstock: I don't think they do that.

Kay: They should. It's like the problem Christian religions have with how to get Socrates into heaven, right? You can't go to heaven unless you're baptized. If anyone deserves to go to heaven, it's Socrates, so this is a huge problem. But only the Mormons have solved this — and they did it. They proxy-baptized Socrates.

Binstock: I didn't realize that. One can only imagine how thankful Socrates must be.

Kay: I thought it was pretty clever. It solves a thorny problem that the other churches haven't touched in 2,000 years."

[0] http://www.drdobbs.com/cpp/interview-with-alan-kay/240003442


"Kay: I was never a great programmer."

I wonder how Kay would score on sites like HackerRank, and how many companies today would pass him over because of that.


I wonder, if you had some impossible deadlines and a ton of code you had to write just to survive to the next funding round, would you want Alan Kay on your team? I say this with enormous respect and admiration for Kay, but there are horses for courses and groundbreaking ideas are not the same as shipping product.


Alan Kay's work is not about how to ship the product within the next 2 months. But it can be helpful to listen to his talks in order to maintain a longer term view and to see fundamental limitations in the way we do things.


I suspect a lot of these videos are the most powerful when you encounter them at exactly the time you happen to be wrestling with those same ideas yourself.

Every time I design software, I think back to Gary Bernhardt's "Boundaries" talk¹ and his practical, concrete suggestions for writing testable code. But I've never met anybody else who seemed as impressed as I was by the idea.

¹: https://www.destroyallsoftware.com/talks/boundaries
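
For anyone who hasn't watched it: the suggestion I took away is "functional core, imperative shell": keep decisions in pure functions over plain values, and push the I/O out to a thin outer layer. A minimal sketch of my own (the names are made up; this isn't code from the talk):

    // Functional core: a pure decision over plain values, trivially unit-testable.
    #include <iostream>
    #include <string>
    #include <vector>

    std::vector<std::string> overdueReminders(const std::vector<int>& daysOverdue) {
        std::vector<std::string> messages;
        for (int days : daysOverdue) {
            if (days > 30)
                messages.push_back("Account overdue by " + std::to_string(days) + " days");
        }
        return messages;
    }

    // Imperative shell: gathers input, calls the core, performs the side effects.
    int main() {
        std::vector<int> daysOverdue = {5, 42, 90};            // stand-in for a DB query
        for (const auto& msg : overdueReminders(daysOverdue))
            std::cout << msg << '\n';                          // stand-in for sending email
    }

The part worth testing never touches the console or a database, and the shell is so thin it barely needs tests of its own.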


Yup, that talk is gold. He's started producing screencasts again, and live streaming some on twitch. https://m.twitch.tv/gary_bernhardt/videos/all


I consider that talk one of the most interesting about programming, and underrated as well.


Thanks for posting this. I just finished watching it and thought it was a great talk with some really forward thinking on organizing functional designs.


Thx for this.


When I was consulting full time, I happened to run into this video at exactly the right time. I shed two toxic clients and a few weeks later negotiated the largest contract I ever had to that point (and honestly, since).

Mike Monteiro: "F* you, pay me" - https://www.youtube.com/watch?v=jVkLVRt6c1U


These talks are a little skewed to the game/high performance programming side of the art, but still very interesting in general. Of the ones I've already seen I really like Mike Acton's talk.

Since this has become such a nice thread some additions I'd add:

* Sandi Metz going through the Gilded Rose or "All the small things"

https://www.youtube.com/watch?v=8bZh5LMaSmE

I already subscribed to her programming style and the general Ruby TDD/BDD movement, but this talk captures all the important values in a single example. I think it moved my programming style from being based on vague things like experience or intuition to being based on the concrete merits shown in this talk.

* Matthew Brecknell demonstrating Hole Driven Development

https://www.youtube.com/watch?v=52VsgyexS8Q

The programming style demonstrated in this video is a real mind bender. I think most Haskell programmers use a weaker version of this; Matthew takes it to the extreme. I didn't adopt this style, I don't think it's practical, but it's the sort of thing that some person someday will incorporate in some more comfortable way in a new language or platform as a revolutionary feature.


> These talks are a little skewed to the game/high performance programming side of the art, but still very interesting in general.

I get the chance to interact with a pretty wide range of software engineers, and the thing that constantly blows people's minds is how caches work and the fact that there's 10-50x performance waiting for you if you know about it (and have the time to exploit it).

It's almost like they don't teach it in school or something. I agree there's a lot of gamedev/perf stuff, but that lines up with my experience of the things that change the way people approach programming.


> that constantly blows people's minds is how caches work and the fact that there's 10-50x performance waiting for you if you know about it

Could you point to some sources?


This is the paper people usually use to explain some of the performance benefits you could reap, beyond what the talk covers about row- vs column-based performance:

https://www.akkadia.org/drepper/cpumemory.pdf

Note that I wouldn't really recommend it unless you're actually going to do low-level programming for high performance. It's super long and explains all the details; if you've got a basic CS college education you should know most of it already anyway.

If you're just a web developer this won't actually help you, as most of these improvements have already been made in the parts that matter (i.e. your database, your operating system, your interpreter and perhaps your application server).
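
If you just want a quick feel for the row-vs-column effect, here's a tiny sketch of my own (not from the talk or the paper; absolute timings vary a lot with compiler flags and hardware):

    // C/C++ arrays are row-major, so the first pass walks memory sequentially
    // and stays in cache, while the second strides by N elements per access.
    #include <chrono>
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    int main() {
        const int N = 4096;
        std::vector<int> a(static_cast<std::size_t>(N) * N, 1);
        long long sum = 0;

        auto t0 = std::chrono::steady_clock::now();
        for (int row = 0; row < N; ++row)        // cache-friendly: contiguous
            for (int col = 0; col < N; ++col)
                sum += a[static_cast<std::size_t>(row) * N + col];
        auto t1 = std::chrono::steady_clock::now();

        for (int col = 0; col < N; ++col)        // cache-hostile: stride of N ints
            for (int row = 0; row < N; ++row)
                sum += a[static_cast<std::size_t>(row) * N + col];
        auto t2 = std::chrono::steady_clock::now();

        using ms = std::chrono::milliseconds;
        std::printf("row-major: %lld ms, column-major: %lld ms (sum=%lld)\n",
                    (long long)std::chrono::duration_cast<ms>(t1 - t0).count(),
                    (long long)std::chrono::duration_cast<ms>(t2 - t1).count(), sum);
    }

Same data, same arithmetic, very different memory traffic.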


+1


> I don't think it's practical, but it's the sort of thing that some person someday will incorporate in some more comfortable way in a new language or platform as a revolutionary feature.

As mentioned below, this can be used in languages like Agda and Idris; they also give you proof search, which tries to automatically fill in the holes from available definitions. This works well for 'proof objects' (the name given to values which only exist to satisfy the type checker), but requires caution for values with 'computational content' (those values which can effect the resulting computation).

Pretty much all Haskell code (except the really wacky astronautical stuff) has computational content, so there are fewer "obvious" obligations to fill in than would appear in a proof (in fact, due to laziness and lack of totality, we can safely use 'undefined' for all proof objects, since they'll never be pattern-matched!).

For example, if our function needs to return a list (e.g. if we're writing a function like map, filter, iterate, replicate, cycle, etc.), a proof search will immediately give us the empty list '[]', which is correctly typed but probably wrong.

For Haskell, tools like djinn can get you a little hole-filling automation, and I think that Emacs modes like ghc-mod and intero support calling out to djinn (I can't test this, as I can't get either to work on NixOS :( ).

For a little more work, you could write properties for QuickCheck (/SmallCheck/LazySmallCheck/etc.) to constrain the behaviour, which would allow trivial solutions like the empty list to be ruled out automatically. If you're a TDD disciple, then you already wrote these properties, so this part would actually be free (as long as the automation tooling exists).

At that point you're basically doing inductive functional programming, so tools like IGOR2 or MagicHaskeller might be useful to plug in as well.


For some, there might be something valuable in these interviews:

https://www.youtube.com/watch?v=QVpSIdWE0do&list=PLEMXAbCVnm...

Again, it's game dev oriented, but there is good general advice as well, here and there.

Mike Acton is on there too. Jonathan Blow, Ron Gilbert, etc.

In general, I think more programmers should be concerned with learning how things work at a deeper level than the API for whatever web framework they are using. Web applications are god-awfully slow, it's like the programmers behind them just don't give a shit about performance, at all. It's not that much better on the desktop side either.


The "hole-driven" development style originated in dependently typed languages such as Agda and Idris, which I imagine Matthew's video was inspired by. So if you want to experience it right now, go learn Agda and use agda-mode in emacs. (Probably something similar exists for Idris. I don't know whether Coq has a notion of "holes".)


idris-mode for Emacs does similar tricks ( https://github.com/idris-hackers/idris-mode ) and I've heard it's very nice.

The Idris REPL itself can do hole-based programming too, which can then be dumped out to a file (:m to list holes, :p to prove a hole, :a to write proofs to disk) http://docs.idris-lang.org/en/latest/reference/repl.html


> The programming style demonstrated in this video is a real mind bender

You might also be interested in:

https://hackage.haskell.org/package/djinn

Djinn writes the code for you. Cut out the middle-man. (Don't show your boss.)


I'll just leave this here:

* Growing a Language, by Guy Steele

https://www.youtube.com/watch?v=_ahvzDzKdB0


I loved that talk!

I especially loved the moment when I finally understood that the point he's trying to make applies to the writing of the very talk he's giving.


I love this quote from Eskil's talk

"C++ is evil because it makes dumb people think they are clever."

Replace C++ with any "intelligent" framework or language.


Replace "evil" and "dumb" with "high and mighty, naive, hyperbole".


These two made an impression on me, especially regarding the delicate trade off between implicit and explicit.

Among other things, how and why to make things more implicit:

http://youtube.com/watch?v=wf-BqAjZb8M (Beyond PEP8 by Raymond Hettinger)

When implicit goes too far:

https://www.destroyallsoftware.com/talks/wat (wat by Gary Bernhardt)


I really like that my two favorites from Rich Hickey are present. If you don't watch any of these you should at least watch 'Simple made easy'.


I clicked the link just to see if I was correct in predicting that "Simple made easy" was on the list. It's on my list, and seems to be on everybody's list. It reminds me of that quote: "I didn't have time to write a short letter, so I wrote a long one instead." (Mark Twain)


Rich has great talks.

However, I am not sure if what the author wrote ("Your most powerful problem solver is your subconscious mind.") is a spot-on articulation of what Rich means by hammock driven development. Here's my summary of what Rich means: immersive, focused thinking followed by unstructured, relaxed, open-ended thinking. It is the combination of the two that is so powerful.


Relevant HN thread that is a goldmine: https://news.ycombinator.com/item?id=12637239


Garret Smith - Writing Quality Code in Erlang

https://www.youtube.com/watch?v=CQyt9Vlkbis


Fun to see Eskil (there's a typo but that's how you spell it) on here, that talk isn't very old.

It would be interesting to see some more analysis from the OP, namely what was learned that changed their thinking, and how their thinking was changed.


This is a great idea, I'll definitely be doing some followup posts in future.


If you haven't seen Eskil's very old video "Developing the Technology Behind 'Love'" you definitely should. In it, he describes the tech design choices he made that made it feasible for him to develop an MMO as a solo project.

https://m.youtube.com/watch?v=f90R2taD1WQ


He mentioned a lie of:

Code should be designed around a model of the world

but I didn't hear any reason why not. The key/value pair was the only reason given, but besides that being an optimization / premature optimization in high performance applications, is there any reason not to design code around a model of the world?

Seems to me it makes things easier to think about.


I am trying to make a game that follows Mike Acton ideas.

Basically: when you try too hard to fit the real world into code, you end up with OOP.

It has one great advantage: it is easy to translate real world into "computer".

but one great big disadvantage: a computer is a computer, not the real world. OOP translates to complexity (think "Architecture Astronauts", and a spaghetti of pointers/references/virtual/inheritance) and things done in a way that harms performance (not a problem for smaller problems, but if your problem is not small...)

His idea is that you should instead fit the world into your DATA, not your code: think about what data your program needs from the real world (do you really need all the tiny details, for example?), what your inputs and outputs are, and THEN you write the code to make that work. You code around your data structures, file formats, etc... not the other way around.
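
A rough sketch of the contrast as I understand it (the entity and field names are made up; this isn't code from his talks):

    // "Model the world": one object per entity. A movement pass drags every
    // field of every entity through the cache, even though it only needs
    // positions and velocities.
    #include <cstddef>
    #include <vector>

    struct Entity {
        float x, y, z;        // position
        float vx, vy, vz;     // velocity
        int   health;
        char  name[32];
    };

    void updateAoS(std::vector<Entity>& entities, float dt) {
        for (auto& e : entities) {
            e.x += e.vx * dt; e.y += e.vy * dt; e.z += e.vz * dt;
        }
    }

    // "Model the data": group what each pass actually touches, so the
    // movement pass streams only the arrays it needs.
    struct Positions  { std::vector<float> x, y, z; };
    struct Velocities { std::vector<float> x, y, z; };

    void updateSoA(Positions& p, const Velocities& v, float dt) {
        for (std::size_t i = 0; i < p.x.size(); ++i) {
            p.x[i] += v.x[i] * dt;
            p.y[i] += v.y[i] * dt;
            p.z[i] += v.z[i] * dt;
        }
    }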


Apologies, haven't watched the video, but there are some pitfalls when designing around a 'model of the world', especially in OO.

I think Wizards and Warriors by Eric Lippert is a perfect illustration of how our initial assumptions and models aren't often the right ones (although it's not so much about data in this case) https://ericlippert.com/2015/04/27/wizards-and-warriors-part...


Just read through those 5 posts - absolutely fantastic, thanks for the link.


I can't (won't) watch YouTube at work, but don't two of the lies contradict? If code is not more important than data, doesn't it make sense to model the code around that data? Or am I conflating data in lie #3 with "the world" in lie #2?


Well, you could say that Acton instead argues that the data should be designed around a model of the world, and the code around that data. Although I concede that sounds like a nitpick.


If I'm not misunderstanding you (as I didn't watch most of the videos there):

See data-oriented programming (not to be confused with data-driven). In the end code just processes data.


The following talk about event sourcing (not listed in the article)• really made an impact on me:

Event Sourcing - Greg Young

https://www.youtube.com/watch?v=JHGkaShoyNs

•: Not that I really expected it to be, I guess.
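
The core idea, in a minimal sketch of my own (the event types are made up, not from the talk): the append-only log of events is the source of truth, and current state is just a replay of that log.

    #include <iostream>
    #include <variant>
    #include <vector>

    struct Deposited { long long cents; };
    struct Withdrawn { long long cents; };
    using Event = std::variant<Deposited, Withdrawn>;

    struct Account {
        long long balanceCents = 0;
        void apply(const Event& e) {
            if (auto d = std::get_if<Deposited>(&e))      balanceCents += d->cents;
            else if (auto w = std::get_if<Withdrawn>(&e)) balanceCents -= w->cents;
        }
    };

    int main() {
        // The log is never mutated in place; new facts are only appended.
        std::vector<Event> log = {Deposited{10000}, Withdrawn{2500}, Deposited{500}};
        Account acct;
        for (const auto& e : log) acct.apply(e);   // replay to derive state
        std::cout << "balance: " << acct.balanceCents << " cents\n";  // 8000
    }

Projections, audit history and "time travel" all fall out of keeping the events instead of only the latest state.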


Overly complicated pointless eventsourcing wasted 1mil+ at my previous employer. Now they have to spend 6 months removing it all.


Care to elaborate? We're having quite good experiences with it at the moment, but would like to learn from others.


We used this mess of a library, which provides no clear interface and corrupts data https://github.com/johnbywater/eventsourcing


A faulty implementation doesn't necessarily invalidate the idea. I also strongly suggest looking into statically typed languages. Python is very poorly suited to working with data.


> Python is very poorly suited to working with data.

This does not seem to be true, based on widespread industry and scientific use of Python generating, managing, and analyzing data.

Perhaps you meant in some specific way, like "Python does not enforce objects' data schema" or some such?


Yes, but even so it's a huge undertaking to build your own event sourcing. Many get burnt.


I'll go out on a limb here and say it wasn't Event Sourcing that caused the problems.

Anything can be ^H^H^H^Hmessed up.


Anyone have a link to Alan Kay's talk, where he demos UIs and a compiler for UIs in ridiculously small # lines of code?

There are a few talks like this, but there is one in particular where he goes into a lot of detail about it. I can't for the life of me find it again.


Is it really "Complex"? Or did we just make it "Complicated"? : https://www.youtube.com/watch?v=ubaX1Smg6pY


Btw: The next iteration of that project lives here: https://github.com/harc/ohm


It would be nice if you linked transcripts where available...


I know most of these. The common theme is emphasis on solving problems instead of getting caught up in religious wars over "the one true way".


I just watched the Blow talk. I had never seen him talk before and was really impressed by his perspective. I found myself agreeing with many of his points in a way that I never would have a couple years ago. I love how his main metric for optimization is developer time more than anything. I've learned how important this is in recent years.


This one by Bret Victor (or any of his other talks, really) also comes to mind: https://vimeo.com/36579366


Yeah.

Firstly, maybe lastly, the conclusion of this video is real powerful stuff. I cannot pinpoint its philosophical anchoring or origin. The same message was at the end of the iconic documentary about Jodorowsky's Dune.

But

I have been trying to mentally operationalize the advice and ideas here, but found it really difficult and abstract.

Obviously an incredible lecture that deserves its own category.


Additional two on "software design and evolution" that were hugely influential for me in my career:

* Responsive design, by Kent Beck https://www.infoq.com/presentations/responsive-design

* The Grand Unified Theory, by Jim Weirich https://vimeo.com/10837903


> Iterating over a two dimensional array by row is much faster than by column.

Another fun fact is that staggering array accesses is faster than linear accesses.


Can someone elaborate on this? I tried searching and wasn't having any luck. I always thought sequential linear access was fastest?


This is all nonsense. Accessing linear memory in a linear fashion is the way to go. You gain absolutely nothing by staggering memory accesses, because you’re waiting for memory to come into cache while the CPU idles, and since you have little control over how cache is utilised across a span of time, the whole concept of doing this to pre-load anything is just bogus, unless you’re doing specific optimisations for a specific architecture; even then there are cache pre-loading instructions for that.


This will technically only work on pipelined architectures, which all major CPU architectures are. You will be able to see a significant boost to performance given the following are all true:

   * You have no interdependency of information across the lines
   * All your data is in the cache
   * You don't have instructions generated between the two lines that will require some sort of operand forwarding
Basically, if you already have the stuff in the cache, the calculation doesn't depend on the last calculation, and you have a processor capable of it, you'll see a large speed improvement.

There is a great talk about this [0] (for the exact moment go to [1]). If you're interested further I'd very much recommend finding some sort of GOOD computer architecture class. Luckily we now have a fantastic resource for locating such courses [2]. This seems like dark magic and many programmers refuse to come to terms with its existence. Check out `dreta`'s comment to see an example.

Now I do agree with dreta that this is VERY much architecture dependent, but most architectures would benefit from this sort of optimization. Pretty much all modern Intel and AMD CPUs have large enough caches, and they all have pipelines under the hood. Intel started this in the Pentium series with a 5-stage pipeline (which I find very funny given the GHz wars that came from that time period, despite the pipeline probably being the largest performance boon in most cases that you would want to brag about). A rough sketch of the independent-accumulator idea follows the links below.

[0] - https://www.youtube.com/watch?v=e08kOj2kISU

[1] - https://youtu.be/e08kOj2kISU?t=26m50s

[2] - https://github.com/Developer-Y/cs-video-courses#computer-org...
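
Here's my own rough sketch of what "staggered" accesses buy you on a pipelined CPU (not from the talk; the effect size depends heavily on compiler flags and hardware, and a vectorizing compiler may already do this for you):

    // The first loop has a serial dependency on `sum`: each add must wait for
    // the previous one. The second keeps four independent accumulators, so
    // several additions can be in flight in the pipeline at once.
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    int main() {
        std::vector<float> data(1 << 24, 1.0f);

        float sum = 0.0f;
        for (float x : data) sum += x;                 // one long dependency chain

        float s0 = 0, s1 = 0, s2 = 0, s3 = 0;
        for (std::size_t i = 0; i + 4 <= data.size(); i += 4) {
            s0 += data[i];                             // four independent chains
            s1 += data[i + 1];
            s2 += data[i + 2];
            s3 += data[i + 3];
        }
        float sum2 = (s0 + s1) + (s2 + s3);

        std::printf("%f %f\n", sum, sum2);
    }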


I'm not that much of a low-level programmer, but my guess would be that it's about loading a whole memory segment into the CPU cache at once.


I wonder if this is language or even compiler dependent. Are there things that transpose memory organization?

I for one always get confused which index is the row and which is the column when coding. Every damn time I've done a 2D iteration in the last 15+ years. Is it [col][row] or [row][col] ...

I imagine it depends.


Indeed it does depend: some languages are row-major (C, C++) while others are column-major (Fortran, and some common HPC libraries are in Fortran).

https://en.wikipedia.org/wiki/Row-_and_column-major_order


I'm not alone!!!!!! :) My brain was melting over today's http://adventofcode.com/2016/day/8 puzzle. I'm attempting everything in Elixir and it's quite different from what I'm used to!


>Are there things that transpose memory organization?

Not exactly what you asked, but if you look at most subroutines in linear algebra libraries like LAPACK, they tend to have extra arguments so you can tell if your inputs are already transposed and/or conjugate in order to have more efficient memory access.


I gotta throw in one of my favorites, because I have not seen another that really speaks so directly to the value of craftsmanship in development. I also deeply enjoyed the allegory with woodworking tools.

https://www.youtube.com/watch?v=ShEez0JkOFw&t=13s


Are the start times on these YouTube links deliberate? Some of these link to the middle of the video, and I'm not sure why.


These are exactly the talks that had the most influence to the way I write software, too. Great list!


I always hugely enjoy these lists, thanks for sharing this one as well. Great material in there :-)


I tried listening to the "why OOP is bad" talk and it was painful. 10 minutes in and 0 insights. He kept saying things like "encapsulation doesn't work" and then moving on. Not willing to stay another 20 min to find out why.


There are many talks about why OOP is "bad", but they tend to come from the FP camp. The decent talks of this nature are backed up by firm, fundamental reasoning showcasing the power of "better" abstractions. I, personally, find both OOP and FP compelling, depending on what I'm trying to model.

Wish I could link a few examples, sorry.


I vehemently disagree with you. He goes about it in a very structured and objective style, and you can also tell he has decades of experience to make informed statements. I think it's super interesting.

It might not be much news if you consider only the title statement, but it's the reasoning that talks are watched for, right?


Mixed feelings about the talk. Encapsulation, and breaking encapsulation, can be key. I started in networking and had the clean OSI/ISO stack pictures in my mind. Then, looking at the management plane of a network stack's TCP protocol, the transport layer details and a few other telco network stacks, I started understanding that the really interesting bits happen at the edges where the layers meet. Some applications cut across the layers.

He made some good points in his analysis with respect to fine-grained OO design. The reasoning around the need to break encapsulation was illustrative. If objects are too small they will have a lot of surface area and their interconnections are hard to reason about. In OO programs, control flow is not easily graspable from reading code. If the object graph is a tangled mess then the team will have an exciting time.

Pulling the program flow out into a large procedural piece of code can look attractive. In my experience such pieces of code are incidentally found in the ..Service and ..Manager classes the speaker is not so fond of. At times this may be a clearer and more effective approach than managing the object graph and distributing logic across messages. But going to procedures of several hundred lines is imho going too far.

When I think about it, software needs to be testable. If the objects are too trivial and a lot of logic is in their interconnection graph, then the risk increases that the important bits are not tested - after all, we got close to 100% coverage. But it is still possible to build the graph and test it in a meaningful way. Being OO, I can also instrument it with test frameworks. That seems less the case with a longer piece of linear code.


There are three talks, actually. Here are the other two.

Object-Oriented Programming is Embarrassing: 4 Short Examples[0]

Object-Oriented Programming is Garbage: 3800 SLOC example[1]

[0]: https://www.youtube.com/watch?v=IRTfhkiAqPw [1]: https://www.youtube.com/watch?v=V6VP-2aIcSc


Watching his solution was the most painful part. His solution is to remove functions, put everything inline, and add comments in place of function names. To me, this sounds like a testing nightmare. It also sounds like a readability and maintainability nightmare.


Solution to what? I think you are missing the context of his statement here.

Inlining functionality into very large procedures has the advantage that it makes it obvious that there are no moving parts. You look at a few lines and know exactly the execution context. This improves readability a lot.

Maybe just try it before calling it a nightmare. Most things in life have pros and cons, the difficulty is to decide what to do in which situations. Don't take anything too seriously or idealistically. Try to understand from the perspective of an honest presenter and apply your own judgement.


> Maybe just try it before calling it a nightmare.

I spent the first ~5 years of my career writing code in this fashion. I actually agreed with most of the talk, but this section was a big sticking point with me.

Inlining functionality into large procedures has the disadvantage that you need to understand the entire procedure to know what the function does. It also has the disadvantage that you need to test every permutation of the branches within that function in order to fully test it. That's an order of magnitude more work. Don't even get me started on his stance on TDD (and how correlating it to the failure of OOP makes no sense at all).

And, yes, I understand that there can be pros. I've made my stance on the cons very clear, and I strongly feel as though they outweigh the pros. Does it make the moving parts more obvious? Probably sometimes. If you can't come up with good names, and your functions are small enough to begin with, probably. Better naming -- which the presenter addresses -- IMO does a much better job solving this though. His solution is to stop trying to name things, mine is to spend more time coming up with better names.


Thanks for clearing that up. He did address the "understanding the entire procedure to know what the function does" part by light use of code comments.

Personally I'm not much invested in testing (most of my programming is recreational), but I fail to see how a big function should be harder to test than two smaller ones that, when combined, do the same things. The two small ones obviously have more possible code paths, since the smaller, driven function is not hidden in the driver function anymore and so can be called in ways that don't actually matter to the purpose of the program. (That's the author's point - he calls it "surface area".)


> fail to see how a big function should be harder to test than two smaller ones that when combined can do the same things

It's understandable how you wouldn't see this without testing experience. Wrote a quick gist to try to explain it as best as I could. https://gist.github.com/lojack/5a8526e88c759acac3f4f46036a37...

That's a trivial example using pseudocode. But, basically, you'll see that in the longer function example you need 10 test cases to cover everything, while in the method that's split apart you need 8. Realistically, I could have further split parseArguments up to simplify those test cases. It seems like a small change, but if you were to add an additional branch to the first method (an if statement) then you'd double the number of tests required. For the longer method that's 10 additional test cases, for the shorter method that's 2.
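
To make the combinatorics concrete, here's a generic sketch of my own (not the linked gist; the helper names are made up). Two independent decisions inlined on one path need every combination covered (2 x 2 paths), while the split helpers need 2 + 2 small cases plus a check that they compose; each extra branch multiplies the first count but only adds to the second.

    #include <string>

    // Split version: each helper is fully covered by two tiny tests.
    std::string normalizeName(const std::string& name) {
        return name.empty() ? "anonymous" : name;
    }
    int clampAge(int age) {
        return age < 0 ? 0 : age;
    }
    std::string describeSplit(const std::string& name, int age) {
        return normalizeName(name) + " (" + std::to_string(clampAge(age)) + ")";
    }

    // Inlined version: both decisions sit on the same path, so path coverage
    // needs every combination of empty/non-empty name and negative/valid age.
    std::string describeInlined(const std::string& name, int age) {
        std::string n = name.empty() ? "anonymous" : name;   // branch 1
        int a = age < 0 ? 0 : age;                           // branch 2
        return n + " (" + std::to_string(a) + ")";
    }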


Realize that both implementations offer the same functionality. You could even say the implementations are basically identical, except for the fact that one gives more "calling surface". So it is only the "ideal" of 100% branch coverage that misleads you into thinking you would need more tests.

The two branches of the big function don't interact in any way (they could also be parallelized), so 100% branch coverage has no benefit here.

In fact you need fewer tests with the big function if you don't know the context in which the functionality is used, since the surface area is smaller.

Another way to look at it is that 100% branch coverage only means all the branches in one isolated function are tested. However, the multiple-functions version calls other functions and the possible branches there are ignored. In other words the interaction with called functions is not tested.


Guy Steele's talks, Sean Parent's C++ talks, and Alex Stepanov's A9 course.


The theological second half of this is bizarre - I've never heard of this being considered a "problem"; heaven (in the Christian tradition) is not a hall of fame that one nominates figures for. And the Mormon tradition of proxy-baptising people who are dead and never showed any sign of adherence to their tenets is highly questionable.


We detached this subthread from https://news.ycombinator.com/item?id=13129374 and marked it off-topic.


Mormon and long time lurker here. Just wanted to give clarity to this because it can be confusing. Mormons don't believe they are forcibly baptizing these people by proxy. They believe that everyone should have the opportunity to accept baptism and salvation through Jesus Christ, even if they did not receive the opportunity in the mortal stage of life: "Individuals can then choose to accept or reject what has been done in their behalf."

https://www.lds.org/topics/baptisms-for-the-dead?lang=eng


Thank you for clarifying. It seems like there is an unfair share of negative perceptions compared to other religions just because it's only ~200 years old.

Although I'm not a member, I lived in a 90% Mormon city for a few years and was made to feel very welcome and accepted.


Hi, sorry if this comes off as offensive, but is there a general consensus in mormonism about a gradual evolution of life on earth?


Sorry for the delayed response. As paulddraper said, there is no official doctrine on evolution, but Mormons do not believe in "Creatio ex nihilo" or that the creation of the earth was literally a seven-day period. The "days" of the creation story are meant to mean periods of time, and we do not fully understand how God created the earth; evolution was not necessarily excluded from it. Mormons do not accept a lot of the "standard" Christian teachings on creation that actually stem from the Greek/Roman philosophical influence in early Christianity and not from scripture. Our official doctrine encourages us to seek to learn and understand the will and processes of God through science in addition to spiritual experiences.

"Latter-day Saints should strive to use both science and religion to extend knowledge and to build faith." "Is there any conflict between science and religion? There is no conflict in the mind of God, but often there is conflict in the minds of men." http://en.fairmormon.org/Mormonism_and_science/Are_they_comp...

More Sources: http://en.fairmormon.org/Mormonism_and_science/Evolution/Off... http://en.fairmormon.org/Mormon_view_of_the_creation


https://en.wikipedia.org/wiki/Mormon_views_on_evolution

tl;dr Some Mormon leaders say it's incompatible with scripture, though the LDS church has never said anything official.


I'll take the negative votes as a no :-)


Did you just criticize a religion for not making sense?


> The weirdest thing about Mitt Romney is that crazy religion. Mormons believe Joseph Smith received gold plates from an angel on a hill, when everybody knows Moses got stone tablets from a burning bush on a mountain.

http://www.cc.com/video-clips/cw4el2/the-colbert-report-yahw...


Mormons consider baptism a gate - proxy baptism opens the gate, the post-mortal individual can choose to enter the path via the gate, or not.

This necessitates a lot of other background beliefs - immortal soul, individual will, and significant ordinances, to name a few.


In the particular case of Socrates, one might speculate that he had the privilege of affirming or denying his faith before Christ himself: http://biblehub.com/1_peter/3-19.htm


Thanks for sharing.



