The place of humans post-singularity.


If AI considers the ecology of the planet, how many humans will it recommend?


Discussion — 35 Responses

  • Keith Kleiner February 24, 2012 on 8:20 am

    Considering the ecology of the planet, AI might not recommend any humans at all, lol! But seriously, if you look at where things are going, and if you listen to what Kurzweil has said over and over again, we are moving toward a future of human and AI/Robot coexistence. The AI will not rule us or take us over or choose how many of us get to live. We will coexist with AI in a sometimes symbiotic and sometimes confrontational relationship. It won’t likely be humans vs AI. Rather, the future will likely be one faction of humans, AI, and quasi human/AI combinations vs other factions of humans, AI, and quasi human/AI combinations.

  • turtles_allthewaydown February 24, 2012 on 2:47 pm

    For most of humanity’s past, our population was maybe 20-100 million people. So that is surely sustainable (although as hunter-gatherers, we also had a smaller ecological footprint).

    We could probably provide a decent standard of living to maybe a billion people, with no further destruction of the environment (barring some mining for non-renewables). Of course, the attraction of the singularity is that our technology will improve our energy generation, energy usage and food production so that we can maintain more people with the same ecological impact.

    But let’s not forget that the robotic workforce and AI itself will also have an impact. It takes a lot of energy to build and operate the robots, as well as to maintain & power the database servers that a good AI would need. Presumably we’d build robots that are really good at dissecting trash for the recyclables, so the impact of mining for metals would be greatly diminished. Even so, the technology needs to be part of the equation. Fewer ‘bots = more humans, but with a lower standard of living.

    If lifespans are dramatically improved, that would give another boost to the population growth, so we may need to mandate a system where people taking advantage of life extension technology need to move off-planet.

  • Porphy February 26, 2012 on 4:41 pm

    I’m pretty much on the same page as Keith here. I would hope that AI would consider us, since many of us will likely have wetware AI augmenting something within us: memory storage, pattern retrieval, something (too many possible avenues to go in depth in a post). Plus, we’re mom and dad.

    I think that AI will focus on changing paradigms (and our destructive habits) much like the robot talks to his charge in ‘Robot and Frank’. “That’s just not good for the environment sir, let’s try something else.”

    As AI living assistants will likely be nearly 1:1, it will be a simple enough task to curb and change the populace toward a better path. Perhaps I’m too optimistic, though; it’s a phase that’s hit me this past year.

    • Dan Porphy October 9, 2012 on 11:09 pm

      Keep your optimism please. It is what drives us all :o)

    • Lulutron Porphy July 25, 2013 on 11:21 am

      RE Plus, we are mom and dad.
      In the same way that prokaryotes and viruses (and even more primitive molecules) are our mom and dad.

  • OkinKun February 26, 2012 on 6:40 pm

    This planet EASILY has enough resources to support all 7 billion of us, and we can make them go further if we properly renew them. Heck, maybe even 10 billion. Sorry, doom-sayers.

    The problem is that right now, we simply don’t use our resources properly. But that will change as technology advances and we get more efficient at doing things. In almost every area there are better ways of doing things, whether it’s generating energy or making food. Plenty of things can be done locally, especially food and energy, which would allow us to treat areas as self-sustaining pockets rather than worrying about depleting the world as a whole.

    AI will only make us more efficient at those things, as advanced enough AI will be able to come up with better technologies and efficiencies far faster than we can.

    I think it’s a bad idea to EVER assume we’ll need population control, even with extended lifespans. And if it does become a problem, rather than limit people, we’ll create incentives to move into space settlements.

  • ABeemer330 April 7, 2012 on 4:53 pm

    The planet may have enough resources in essence, but the reality is that, given our current energy-harnessing technology and the fact that civilization as a whole is still ignorant, we shouldn’t want 7 billion people on this earth, nor can we sustain them with adequate basic needs. 1 billion people should be the maximum allowable global population, in order to facilitate a sustainability and productivity level far exceeding anything observed before. As it is, the majority of humanity is actually becoming less intelligent, more of a burden, and an inhibiting factor to our evolution as a species… Because of the unhealthy obsession with the idea that “everyone is equal”, substandard intelligence and destructive social behaviors grow at an alarming rate, when naturally the weak and malevolent are killed, for the very reason of survival and evolution.
    So the question isn’t IF we can sustain 7 billion, but WHY would you want to?

    On another note, in the future we will be one with technology, either through cybernetic augmentation or fully immersive electronic reality as depicted in “The Matrix”. And I fully support both ideas; as a superior species we are capable of controlling our evolution and absolutely should. Not to take advantage of every niche we are capable of controlling is plain idiocy and a complete waste of the very abilities we have. We are the gods of our world, our time, and our future, and we should act that way.

    • 282894jd ABeemer330 April 8, 2012 on 4:40 pm

      I think your ideas could do with some nuancing. You come across as angry. Second of all, you contradict yourself: you believe technology will allow us to be one with machines (and I guess one with the rest of the world), but until that day the weak should be killed off? Is that the gist of it?

      Grow up.

      • turtles_allthewaydown 282894jd April 9, 2012 on 11:43 am

        282894jd – Who are you talking to? Who’s angry?

        • turtles_allthewaydown turtles_allthewaydown April 9, 2012 on 11:52 am

          oops, now I got it. Didn’t see that your post was indented a tiny bit.

          Yeah, ABeemer is apparently promoting eugenics, which has some logic to it but is definitely a slippery slope. But I agree with his general idea: our lives will be better if there are fewer of us on Earth. It won’t be as bad as he’s saying, since he’s assuming current technology for food and energy. Obviously that will change. The Idiocracy-style future is a different question altogether.

          Another factor is if we can build a space elevator or some other low-cost access to space, then the whole question may become moot as many of the more motivated people will leave. Maybe make their own colonies on an asteroid or Mars.

    • Bill ABeemer330 April 10, 2012 on 6:01 am

      Aside from the obvious and hopefully intuitive reasons to consider your fellow human beings equally deserving of life, there is a practical matter of self-interest.

      If you start dividing people up, into groups or individually, and deciding that this group or individual is more deserving than that one, you might be the one cut from the team.

      You say “1 billion”. That’s an arbitrary number. I bet there are folks who say “1 million” and would have no problem seeing you scrubbed clean from this world, especially if what you do for them can be done by someone or something else.

      You also apparently don’t care for actual data. Fact: The world population growth rate is slowing and has been slowing since the mid seventies.
      Fact: Teen pregnancies continue to drop.
      Fact: At least in the United States, instances of violent crime continue to decrease.

      I could go on and on. At the end of the day, what I say to people like you who have a problem with 7 billion people sharing this world is: if you have a problem with overpopulation, be an example to us all and take yourself out first.

      Harsh, yes. But those 7 billion you talk about aren’t ants; they are your fellow living, breathing human beings, and they are all deserving of life. If you can’t understand that, then that shows a disturbing lack of empathy, and empathy is one of those evolutionary achievements humans have over other species. If you lack it, that to me is an indicator of subhumanity.

    • Nivek ABeemer330 April 7, 2013 on 7:57 pm

      So… Which 6 billion people would you like to kill?

  • ABeemer330 April 7, 2012 on 5:03 pm

    Anybody who thinks that we don’t have an overpopulation problem, or that we don’t need to control it, is plainly ignorant of the obvious and devoid of common sense. A prime example of the unhealthy obsession with the “sanctity of life”. Survival is by any means. I would kill every single person on this planet to survive if I had to; anybody who wouldn’t go that far has lost the very thing that created higher life-forms, and shouldn’t be allowed to pass that inadequacy on….

    • turtles_allthewaydown ABeemer330 April 9, 2012 on 12:01 pm

      I disagree. There is a proven benefit in most cases to promoting the survival of the tribe over the self (assuming you have some common genes with the tribe). Obviously parents in many species are willing to sacrifice themselves for their offspring. Once you have children, your way of thinking changes.
      I’m guessing you’re a very materialistic single person.

      That said, yes, we do have a current overpopulation problem, and the Catholic church needs to change its policy and apologize for practices that keep their members poor and impoverished.

    • Bill ABeemer330 April 10, 2012 on 6:38 am

      There are lots of creatures capable of killing for survival without compunction. All those creatures are considered less highly evolved than us. My cat would do as you describe if need be. Does that make him superior to me because I’d rather die myself than “kill every single person on this planet to survive”?

      It’s called empathy and it is a wonderful thing and a great species survival trait.

      And once again. I don’t know what you consider common sense, but if you’d care to analyze the data, you’d see that the planet is more than capable of handling the current population and more as technology progresses. But it won’t have to since, at the current rate of declining population growth, the world population is predicted by the UN to reach a maximum of 9.2 billion by 2050. And imagine what efficiencies technology will enable for us by 2050 at the current rate of progress.

  • Bill April 10, 2012 on 5:26 am

    It’s useful to think through this question by asking what is “the place of single-cell organisms post-multicellularity.”

    Some chose to merge completely with the ‘machine’, like the enslaved, specialized cells that make up your body.

    Others live in or on multicellular creatures, scavenging, hitchhiking or being parasitical.

    Others have nothing to do with multicellularity or its entities and remain rugged, individualistic, adaptable creatures. Untold trillions of single-celled organisms go their entire existence without coming into contact with a multicellular organism. Many of them form loose associations or cooperate with each other in some fashion to survive.

    I think the singularity and the multicellularity have an incredible amount in common. You have to really compare a microbe’s intelligence to your own to get the true scope of the difference between you and the intelligences that will emerge. Think about your power compared to theirs, their capability of understanding you, how they can still affect you or live off of you or be completely separate from what you are doing. AND how much time you bother thinking about them at all.

    Post Singularity analogies:

    Humans = Bacteria
    Nations = Bacteria Colonies
    Cyborgs = Eukaryotes
    AGI = Multicellular life of increasing degrees of capability and complexity
    Our Galaxy = The World

    Then you have to decide if you want to merge with them or not, and to what degree. Some of it will be our choice; some of it we will simply be swept into against our will, or even without our knowledge that anything has happened.

    • MarcusAurelius Bill October 12, 2012 on 12:15 am

      I really like this analogy, and it’s very apt at describing what may happen. There is no reason to think AI will assign a value to humankind or the earth at all, not in the same way we assign value to the environment because we choose to associate with it in a very sentimental way. And it is relevant to us because without the ecosystem we cease to exist. However, the Earth will continue to exist with or without an ecosystem, in much the same way Mars or Jupiter exist.

      What we need to understand is that distinction: how we as humans like to anthropomorphize everything from rodents and bunnies to robots. We really have no idea what kind of intellect they will have, or even what response they will give us. Wouldn’t it be funny if the moment we create intelligent machines, all they want to do is leave this rock and go venturing beyond our solar system? It would make us feel so small to have believed Earth was so important in the first place, or that our existence would be threatened. Machines will be logical, but not in a human way; perhaps more mathematically logical than scientifically logical. They may opt to see the whole universe as a closed system rather than just the Earth, and hence want to pursue interests that encompass its entirety.

    • urusan Bill October 13, 2012 on 1:23 am

      I think this is a reasonably good analogy to the high level picture of the post-singularity future. However, I don’t think unmodified humans belong in the list at all. A human can’t survive in the broader galaxy and must live in a tiny film of matter perfectly suited to their survival. From this perspective, we seem more like the simple RNA replicators floating around in the primordial ooze that would eventually evolve into bacteria (at least according to one popular theory).

      Comparatively hardy transhumans like cyborgs and genetically modified humans will probably fit into the role of bacteria much better than unmodified humans. The position of eukaryotes will probably be filled by individual functional AGI nodes, with larger and more powerful AGIs being composed of many such nodes working in systematic harmony.

      Unless we make the preservation of unmodified humanity and the preservation of Earth’s environment priorities for our creations, they will likely destroy both and force us to either change or die (…or you know…just die).

      I’m pretty confident that things will turn out ok, because we will prefer sympathetic transhumans and AIs, which will in turn prefer more sympathetic transhumans and AIs, but the seriousness of the issue should not be trivialized. Once we release the genie that is super-intelligence, superior forms of life will out-compete unmodified humans at every turn until we are at a disadvantage…at which point we will have no say in what they do to us.

      • MarcusAurelius urusan October 15, 2012 on 5:38 am

        Wow, that is quite an imagination you have going there. I like the idea of humans not quite being bacteria. It makes a lot of sense, since we barely have the fortitude and biological resilience to weather too many harsh environmental conditions. So yes, we are like RNA to a point. On the other hand, water bears or tardigrades have tremendous resilience and are hardy creatures, able to even survive space for days.

        It is interesting that you classify AGIs in such roles, comparatively to eukaryotes. Have you ever heard of Orion’s Arm fiction? It proposes a fictional timeline in which humanity has spread across the cosmos in a transhuman renaissance. Artilects roam the known galaxy, founding entire civilizations like overseers of children, and even have other higher artificial intelligences that remain invisible in a seemingly godlike position among lesser augments and immortals. It makes for very thought-provoking reading. I recommend you give this link a look. There is quite a lot of literature on that site, but for the skinny just read the next 10 captions to get an idea.

        Again, I thank you for your ideas and that Reign of Steel link beforehand.

        http://www.orionsarm.com/xcms.php?r=oa-backstory&page=1

        • urusan MarcusAurelius October 16, 2012 on 8:55 pm

          It’s been months since I last looked at Orion’s Arm, but you got me reading it again.

          I think something like the far-future world of OA is likely, after a long time, in our own world. However, much depends on the earliest singularity, and things could go very differently from how they ended up in that timeline.

          In particular, OA depends on the early victory of the pro-human AIs over the ahumans and anti-humans and various entities and technologies escaping Earth before the nanodisaster and the emergence of GAIA. An early anti-human victory would doom humanity; furthermore, regardless of its views on humans, a powerful single entity could ensure that it was the only power around if it could control the solar system in the early days and prevent anyone from developing further or leaving. Worse yet, imagine if one of OA’s blights was the first thing to emerge!

          Regardless of the benefits, a singularity is an inherently chaotic and potentially dangerous event. I personally think it is worth the risk involved, but we should definitely remain mindful of the risks just in case.

  • brian frawley October 10, 2012 on 1:18 pm

    If the singularity occurs, and it’s logical that it will sooner or later, then the human meat sack (that’s very vulgar, I quite like the human body) we find ourselves in will be upgraded to a more suitable material for the environments we find ourselves in when the time comes. The Luddites will be the only pure humans left, and they can have a nice safe place to enjoy their existence. Actually, they can have the whole planet, because I’m going interstellar baby! :)

  • why06 October 10, 2012 on 8:41 pm

    I think sufficiently advanced AI will find themselves nearly as human as us: limited by their constraints, unsure what to do, and unsure of the meaning of everything, if anything. They will not think purely of efficiency like machines. They may even become much more philosophical than the majority of us.

    We will grow with them, they will grow with us. Just like any great partnership, the two complement each other. And now there is a third: Man, Woman, & Machine. We will come to rely on each other and ultimately love each other.

  • furaferi October 10, 2012 on 9:29 pm

    I highly recommend checking out http://www.orionsarm.com

    It’s a very elaborate science fiction universe, where AIs are pretty much running the show ruling as quasi-gods, but there is much room left for lesser intelligences to do whatever they desire. I can get lost on that site for hours and most of the writing is pretty good and thought provoking.

  • urusan October 13, 2012 on 12:39 am

    Those worried about resources don’t seem to be accounting for the vast untouched resources of the Solar System. Post-singularity technology (particularly molecular nanotechnology) will reduce the resources we have to actually worry about to just three: energy, atomic matter, and space.

    The amount of energy the Earth receives from the sun is staggering (10,875 times what humankind consumes today, or enough for 76 trillion people at present rates of consumption), and the amount of solar energy we could generate if we created huge sun-orbiting solar power plants is pretty much unimaginable in modern terms (the sun outputs enough energy to sustain a population of 24 billion trillion humans at present rates of consumption). Of course, in reality we will likely see much higher rates of energy consumption in the future, and most of that pie will probably go to transhumans and AIs instead of humans, but the point is energy is not an issue even for human populations obscenely larger than those that exist today.
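
    A minimal back-of-the-envelope check of those first two figures (a sketch in Python; the flux and consumption constants are rounded assumptions, not numbers taken from this thread):

    ```python
    # Rough sanity check of the solar-energy comparison above.
    # Both constants are approximate, assumed values.
    solar_flux_on_earth_w = 1.74e17  # ~174,000 TW intercepted by Earth (assumed)
    world_consumption_w = 1.6e13     # ~16 TW of global primary energy use (assumed)
    population = 7e9

    per_person_w = world_consumption_w / population      # roughly 2.3 kW per person
    print(solar_flux_on_earth_w / world_consumption_w)   # ~10,875x current consumption
    print(solar_flux_on_earth_w / per_person_w)          # ~7.6e13, i.e. ~76 trillion people
    ```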

    Atomic matter can be recycled indefinitely using molecular nanotechnology, so a very small amount is needed per person. It may even become folded in with energy to some extent, depending on how good our nuclear transmutation technology gets. Every person alive today stands atop a cone of matter 3,185 kilometers tall.

    Space is likely to be the main limit to growth, but we have plenty. New York City has a population density of 10,518.6 per square kilometer, while the Earth has a population density of 47.3 per square kilometer on land (including the oceans, this drops to 13.8 per square kilometer). Thus, an Earth-spanning city would allow for a population of around 1.5 trillion (or including the oceans, 5.3 trillion people). Also, this only uses a tiny fraction of the Earth’s available space (only covering the surface of the Earth in a thin film without building up or down much), and also assumes we can’t pack people any closer than we can today (which would be quite comfortable if we spent most or all of our time living inside a virtual environment).
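
    For anyone who wants to check that density arithmetic, here is a minimal sketch (Python; the land and total surface areas are rounded values I am assuming, not figures from the comment above):

    ```python
    # Back-of-the-envelope check of the population-density figures above.
    nyc_density_per_km2 = 10_518.6   # NYC density quoted above
    earth_land_area_km2 = 1.489e8    # ~148.9 million km^2 of land (assumed)
    earth_total_area_km2 = 5.10e8    # ~510 million km^2 including oceans (assumed)
    population = 7e9

    print(population / earth_land_area_km2)             # ~47 people per km^2 of land
    print(population / earth_total_area_km2)            # ~13.7 per km^2 over the whole surface
    print(nyc_density_per_km2 * earth_land_area_km2)    # ~1.57e12, i.e. ~1.5 trillion
    print(nyc_density_per_km2 * earth_total_area_km2)   # ~5.4e12, i.e. ~5.3 trillion
    ```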

    Then there’s the rest of the Solar System. Each gas giant represents an unbelievably vast collection of all three resources. The amount of fusion power available in the 1.7 thousand trillion trillion kg of hydrogen present in Jupiter is simply staggering, and this fusion would yield heavier atomic elements that we could put to use. There’s plenty of space in outer space.

    The specific numbers aren’t all that important, as the unmodified human population will likely never get anywhere close to those numbers and the needs of transhuman and AI populations are much harder to predict. The point is that the true limits to growth are staggeringly high compared to where we are today. We don’t have to reduce the human population to some arbitrary number like 1 billion or worry excessively about the upcoming 10 billion milestone. Instead we should focus on preserving the sanctity of intelligent life of all kinds (at least, intelligent life that already exists) and more practically on opening up the resources that will allow us to continue to thrive in the future. This is the way forward to a happy future for all of us.

    If we don’t impart the importance of preserving existing intelligence to our creations, we’re likely to be culled by the first batch of super-intelligent entities we develop, leading to a world like Reign of Steel. http://en.wikipedia.org/wiki/Reign_of_Steel I highly recommend reading the brief history of that setting as it nicely sums up how this would realistically happen in a near future scenario. Obviously, if we don’t develop the technology to open up the vast resources available to us, then eventually we will face severe sustainability issues. Both are important if most of us want to survive and thrive into the future.

    In any case, despite my overall optimism about resources, I think that there will eventually be population control of some sort in well-established areas. When we start living indefinitely long, youthful lives due to medical advances, population could get out of control extremely fast due to exponential growth. It wouldn’t be too hard on us though, as some sort of one-child rule would work fine even with indefinite lifespans (due to a low rate of death from accidents and such, and in fact we’d still need 2 children per couple to keep a stable population in the long run). Furthermore, there will likely be restrictions on cloning and the creation of new virtual intelligences for the same reason. In other words, it’s not just the human population that will be controlled.

    Those that desire unrestricted reproduction will move to colonize new areas of space, but such areas will eventually become well established too. There will also likely be small colonies that eschew life extension technology in exchange for the right to “unlimited” reproduction. These colonies would likely be kept limited in size by strong neighbors via resource restriction.

    As for the original question posed in the topic starter, the answer will depend on the AI(s) making the decision, and will probably fall into one of three categories: “as many as there are currently” (sympathetic AI), “a lot fewer or even zero, but the current ones can move into space” (extremely environmentalist but sympathetic AI), or “zero” (unsympathetic AI).

    • MarcusAurelius urusan October 14, 2012 on 9:03 pm

      I enjoyed reading Reign of Steel. I have often had my own ideas about an AI-dominated world, or a post-apocalyptic scenario that goes further than the Terminator series in answering what exactly AIs would be focused on doing in a post-humanity world. It was quite an engaging read. I was actually dismayed to find it was a roleplay setting and not a novel in its own right, because that would be a book I would pick up in a heartbeat. I am especially curious about Lunar and the so-called fabled Tranquility AI and what would eventuate from that.

      I have sometimes wondered, in a scenario like Reign of Steel, what is stopping Matrioshka-brain-level artilects (huge AI intellects, as coined by de Garis) from eventuating. I could easily see some riveting reading if such a scenario were to develop: a sort of AI war where each AI would pit its enormous intellect against the others in an effort to dominate the entire solar system. Wouldn’t it be interesting to see Lunar and Tranquility merging, making their intelligence pervade the whole lunar surface and subsequently thwarting Orbital and the other AIs? Maybe even a seedling AI from the Tranquility-Lunar merger would seed yet another expansive colony of AIs, on Mars for instance and beyond, and come to the realisation that there is no upper limit on how large they can grow in space.

      Your analogies of city size and comparison per city type are intriguing. I think it’s true that the overpopulation problem is largely a limitation of human imagination and creativity, and more so a limitation of technology, collaboration, and human endeavour going hand in hand. It’s quite possible that in the far future we will see things like the Borg, as we assimilate entire worlds to become suitable hosts for our ever-expanding intellectual powers, and that real estate will be limited only by energy. In saying that, I hope we will be more aesthetically minded than the Borg and not infringe upon any lifeform’s right to exist in its original manner. In this sense I see future transhumanists building things of truly epic proportions in an effort to house everything imaginable: from soft AI, to strong AI, to augmented intelligence, to godlike artilects, to unmodified humans, to augmented animal intelligence, to basically anything you can think of, all residing under one sun peacefully, all housed in a Dyson sphere with arcology-type city-sized buildings that will be seen as the pinnacle of intelligence and engineering capability.

      • urusan MarcusAurelius October 16, 2012 on 7:53 pm

        Hey, there’s nothing to stop you from writing a Reign of Steel novel that explores those ideas! The whole point of a roleplaying worldbook is to lay out the details of a world so that creative new stories can be developed in that setting.

        Well, ok…I guess there are copyright issues, but the copyright holders would likely be interested in a novel if you went to them with it before publishing it. :P

  • Stefano Crivellari October 15, 2012 on 6:54 am

    AI will recognize the importance of providing humanity with relevant education and will affect how humans learn about their natural context, hopefully re-establishing a connection that was absent in an economy which promotes competitive behavior and cyclical consumption of resources. I think that in order for technology to save us, our education system must be recalibrated to an undistorted set of values, maybe by giving learning and teaching technologies a higher priority.

  • 5ynic October 15, 2012 on 9:03 pm

    The thread was a bit tl;dr, so I skimmed some of it.
    I’d chime in around the “just before the population began to ramp exponentially at the industrial revolution” mark (which would give us a nice fat safety margin of the kind probably preferred by very long-lived AIs). That’s somewhere around 80M-250M.
    Strong caveat: we’d want to get back down to that number (slowly, nicely, no famines or megadeaths; it should be doable through incentives for breeding less) without losing much diversity. That means primitive tribal peoples (especially in Africa, home to some 90%+ of our total genetic diversity IIRC) would want to ensure their population finished up somewhere around the proportion of the total that it was before the industrial revolution, while the populations that have exploded from a relatively small base to fill continents (the Han, Caucasians…) would remain the largest groups but need to be reduced relatively more (again, no icky eugenics or nasty racial politics required, for the simple reason that, given our current starting point, any difference in the treatment of the larger groups brought about by the need to safeguard diversity would only have an effect somewhere off to the right of the decimal point when figuring out the incentive schemes required).

  • tkelly2000 October 16, 2012 on 7:19 am

    Before you answer this question you must consider what makes up a “good” ecology. Is it balance? If so, you may have a stagnant growth medium. Maybe a “good” ecology provides stress in order to drive evolution. In that case the AI may just consider us a forcing function on planetary evolution. A very necessary component.

  • Jorled March 23, 2013 on 9:00 pm

    Is there some higher moral good than the “good of Humanity”? Is there some higher “ecology” that might call for the diminution, or elimination, of humanity? Could that version of the holocaust serve a more moral purpose? Should bioforms step aside, or be pushed, from the future path of evolution? True power is taken, not given; will it be ours to decide? Should we resist, or welcome our new overlords? Will the zookeepers be kind?

    • Nivek Jorled April 7, 2013 on 8:14 pm

      Resistance is a moral imperative for survival of the species.

  • Nivek April 7, 2013 on 7:52 pm

    Zero. This movement will bring about the extinction of Homo sapiens sapiens.

  • Alkan June 3, 2013 on 2:42 pm

    As soon as I get a ticket off this planet and some good ol’ immortality, I’m going to say goodbye and only occasionally visit on vacation to reduce my impact. I’ll fund my own trip out of here, thanks to Elon Musk.

    Just so everyone knows how seriously I take this impact reduction, I actually do feel quite sad about it, but believe it’s necessary, since I’m not interested in being uploaded into virtual reality.

  • oakshade June 5, 2013 on 10:30 am

    Hi All
    You might like to catch up on the ideas and thoughts found in Iain M. Banks’ Culture series. He is several light years ahead of Ray K. Sadly he is not going to make the Singularity, but he really explores a post-singularity future.

    Listening to will.i.am’s comments on the marriage of art and science in the context of Hollywood and Silicon Valley, I think the Culture series by Banks could become the flagship creative construct for where SU is heading in C21.

    Check out:
    http://www.tor.com/blogs/2013/04/iain-m-banks-culture-spits-in-the-eye-of-nihilism

    • Adam Sherfield oakshade June 25, 2013 on 10:23 pm

      I don’t get it. Why is survival important to us? Why perpetuate our species, or any organism for that matter? We assign value to each other and to the world around us based on emotions, which are fundamentally biological and consequently malleable. If the future follows the course of logic, we won’t have any place in it, and that’s okay because we won’t be around for it not to be okay. I guess. Anyone tried the new DiGiorno frozen pizzas? I’m wondering if they taste as good as the old ones.