The Curious Wavefunction



Musings on chemistry and the history and philosophy of science

Theories, models and the future of science





Dark matter and dark energy: models that account for the distribution of matter and the acceleration of the universe (Image: Edelweiss)

Last year’s Nobel Prize in Physics was awarded to Saul Perlmutter, Brian Schmidt and Adam Riess for their discovery of an accelerating universe, a finding that led to the startling postulate that 75% of our universe consists of a hitherto unknown entity called dark energy. It is an important discovery preceded by brilliant minds and an exciting history. It continues a grand narrative that runs from Henrietta Swan Leavitt (who established a standard reference for calculating astronomical distances) through Albert Einstein (whose despised cosmological constant was resurrected by these findings) and Edwin Hubble, continues through Georges Lemaitre and George Gamow (with their ideas about the Big Bang), and culminates in our current sophisticated understanding of the expanding universe.

But what is equally interesting is the ignorance that the prizewinning discovery reveals. The prize was awarded for the observation of an accelerating universe, not the explanation. Nobody really knows why the universe is accelerating. The current explanation for the acceleration consists of a set of different models incorporating entities like dark energy, none of which has been definitively shown to explain the facts well enough. And this makes me wonder whether such a proliferation of models without accompanying concrete theories is going to typify science in the future.

The twentieth century saw theoretical advances in physics that agreed with experiment to an astonishing degree of accuracy. This progress culminated in the development of quantum electrodynamics, whose accuracy, in Richard Feynman’s words, is equivalent to calculating the distance between New York and Los Angeles to within a hairsbreadth. Since then we have had some successes in quantitatively correlating theory with experiment, most notably in the work validating the Big Bang and the development of the standard model of particle physics. But for dark energy there is as of now no theory that remotely approaches the rigor of QED when it comes to comparison with experiment.
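
Feynman’s comparison can be checked with one line of arithmetic. Taking a hairsbreadth as roughly 0.1 mm and the New York–Los Angeles distance as roughly 3,900 km (both figures are rough assumptions for illustration only), the implied relative accuracy works out to a few parts in a hundred billion:

```python
# Rough, illustrative figures -- assumptions, not measured values.
hair_breadth_m = 1e-4   # ~0.1 mm, a typical human hair width
ny_to_la_m = 3.9e6      # ~3,900 km between New York and Los Angeles

relative_accuracy = hair_breadth_m / ny_to_la_m
print(f"relative accuracy ~ {relative_accuracy:.1e}")
```

That ratio, a few parts in 10^11, gives a sense of the scale of agreement Feynman was describing.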

Of course it’s unfair to criticize dark energy since we are just getting started on tackling its mysteries. Maybe someday a comprehensive theory will be found, but given the complexity of what we are trying to achieve (essentially explaining the nature of all the matter and energy in the universe) it seems likely that we may always be stuck with models, not actual theories. And this may be the case not just with cosmology but with other sciences. The fact is that the kinds of phenomena that science has been dealing with recently are multifactorial, complex and emergent. The kind of mechanical, reductionist approaches that worked so well for atomic physics and molecular biology may turn out to be too impoverished for taking these phenomena apart. Take biology, for instance. Do you think we could have a complete “theory” of the human brain that quantitatively calculates all the brain states leading to consciousness and our reactions to the external world? How about trying to build a “theory” of signal transduction that would allow us not just to predict but to truly understand (in a holistic way) all the interactions with drugs and biomolecules that living organisms undergo? And then there are other complex phenomena like the economy, the weather and social networks. It seems wise to say that we don’t anticipate real overarching theories for these phenomena anytime soon.

Molecular models, such as the ribosome model depicted here, are already an integral part of chemistry and biology (Image: MRC)

On the other hand, I think it’s a sign of things to come that most of these fields are rife with explanatory models of varying accuracy and validity. Most importantly, modeling and simulation are starting to be considered a respectable “third leg” of science, alongside theory and experiment. One simple reason for this is the recognition that many of science’s greatest current challenges may not be amenable to rigorous theorizing, and we may have to treat models of phenomena as independent, authoritative explanatory entities in their own right. We are already seeing this happen in chemistry, biology, climate science and social science, and I have been told that even cosmologists now rely extensively on computational models of the universe. My own field of drug discovery is a great example of the success and failure of models. Here models are used not just in computationally simulating the interactions of drugs with disease-related proteins at a molecular level but in fitting pharmacological and X-ray diffraction data, in constructing gene and protein networks, and even in running and analyzing clinical trials. Models permeate drug discovery and development at every stage, and it’s hard to imagine a time when we will have an overarching “theory” encompassing the various stages of the process.

Admittedly these and other models are still far behind theory and experiment, which have had head starts of about a thousand years. But there can be little doubt that such models can only become more accurate with increasing computational firepower and more comprehensive inclusion of data. How accurate remains to be seen, but it’s worth noting that there are already books that make a case for an independent, study-worthy philosophy of modeling and simulation; a recent book by the University of South Florida philosopher Eric Winsberg, for instance, urges philosophers of science to treat models not just as convenient applications and representations of theories (which would then be the only fundamental things worth studying) but as independent explanatory devices in themselves that deserve separate philosophical consideration.

Could this then be at least part of the future of science? A future where robust experimental observations are encompassed not by beautifully rigorous and complete theories like general relativity or QED but only by different models patched together through a combination of rigor, empirical data, fudge factors and plain old intuition? This would be a new kind of science, as useful in its applications as its old counterpart but rooted only in models and not in complete theories. Given the history of theoretical science, such a future may seem dark and depressing. That is because, as the statistician George Box famously quipped, all models are wrong, but some are useful. What Box meant was that models often feature unrealistic assumptions about the details of a system and yet allow us to reproduce the essential features of reality. They are subject to fudge factors and to the whims of their creators. Thus they can never provide the certain connection to “reality” that theories seem to. This is especially a problem when disparate models give the same answer to a question. In the absence of discriminating evidence, which model is then the “correct” one? The usual, convenient answer is “none of them,” since they all do an equally good job of explaining the facts. But this view of science, where models that can be judged only on the basis of their utility are the ultimate arbiters of reality, and where there is thus no sense of a unified theoretical framework, feels deeply unsettling. In this universe the “real” theory will always remain hidden behind a facade of models, much as reality is always hidden behind the event horizon of a black hole. Such a universe can hardly warm the cockles of the hearts of those who are used to crafting grand narratives for life and the cosmos. However, it may be the price we pay for more comprehensive understanding.
In the future, Nobel Prizes may frequently be awarded for important observations for which there are no real theories, only models. The discovery of dark matter and dark energy and our current attempts to understand the brain and signal transduction could well be the harbingers of this new kind of science.
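
Box’s quip can be made concrete with a toy sketch (the functions below are purely illustrative assumptions, not drawn from any of the fields discussed above). Treat y = sin(x) as the “true” phenomenon and the small-angle approximation y = x as a knowingly wrong model: it is exact nowhere except the origin, yet within its regime of interest its error is negligible, which is precisely what makes a wrong model useful.

```python
import math

def truth(x):
    """The 'true' phenomenon (a stand-in for reality)."""
    return math.sin(x)

def model(x):
    """A wrong-but-useful model: the small-angle approximation."""
    return x

# In the regime the model was built for, it works very well...
xs = [i / 100 for i in range(51)]                 # x in [0, 0.5]
max_err = max(abs(truth(x) - model(x)) for x in xs)
print(f"max error on [0, 0.5]: {max_err:.4f}")    # small (~0.02)

# ...but push it outside that regime and it fails badly.
print(f"error at x = 2.0: {abs(truth(2.0) - model(2.0)):.4f}")  # large (~1.09)
```

Neither number tells us which description is “correct”; it tells us only that the model is adequate for a given purpose, which is exactly the unsettling situation described above.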

Should we worry about such a world rife with models and devoid of theories? Not necessarily. If there’s one thing we know about science, it’s that it evolves. Grand explanatory theories have traditionally been supposed to be a key part, probably the key part, of the scientific enterprise. But this is mostly because of historical precedent as well as a psychological urge for elegance and unification. And even historically, sciences have progressed far without complete theories, as chemistry did for hundreds of years before the emergence of the atomic and structural theories. The belief that a grand theory is essential for the true development of a discipline has been resoundingly validated in the past, but its utility may well have plateaued. I am not advocating some “end of science” scenario here, far from it, but as the recent history of string theory and theoretical physics in general demonstrates, even the most mathematically elegant and psychologically pleasing theories may have scant connection to reality. Because of the sheer scale and complexity of what we are currently trying to explain, we may have hit a roadblock in the application of the largely reductionist traditional scientific thinking that has served us so well for half a millennium.

Ultimately, what matters is whether our constructs, be they theories, models, rules of thumb or heuristic pattern recognition, are up to the task of constructing consistent explanations of complex phenomena. The business of science is explanation; whether that explanation comes through unified narratives or piecemeal models is secondary. Although the former sounds more psychologically satisfying, science does not really care about stoking our egos. What is out there exists, and we do whatever is necessary and sufficient to unravel it.

This is a revised version of a past post.

Ashutosh Jogalekar About the Author: Ashutosh (Ash) Jogalekar is a chemist interested in the history and philosophy of science. He considers science to be a seamless and all-encompassing part of the human experience. Follow on Twitter @curiouswavefn.

The views expressed are those of the author and are not necessarily those of Scientific American.






Comments (20)

  1. Sisko 2:13 pm 09/5/2012

    There is nothing magical about the use of models. They are the tools of science and engineering. The key is that models need to be verified as accurate based upon observation and need to provide repeatable results.

    One field of science where models have been used incorrectly is in the field of climate science. Many in the field of climate science have been using models called general circulation models to predict what a warmer world will look like and then to tell people how they should live.

    Upon examination these GCMs do a very poor job of accurately forecasting future conditions. Unfortunately, this has not stopped some so-called climate scientists from telling people to take actions based on the failed models. That is bad science and bad policy!

  2. FB3636 3:49 pm 09/5/2012

    Thanks to the exponential increase in computer power, actually testing many scientific models is becoming possible. That is the future of science.

    For example, if we accurately simulate a planetary nebula around a star, can we really see rocky planets, gas giants, asteroid regions etc. forming, or do the current models have flaws?
    Or if we simulate individual atoms down to protons and neutrons or even quarks, can we verify that all chemistry and nuclear reactions happen the way we expect?

    The only way to know for sure is computer simulation.
    I am sure many realistic computer simulations will show great surprises and earn Nobel Prizes in the future.

  3. tharter 4:35 pm 09/5/2012

    Actually Ash, there’s a rather large difference between a theory and a model in one respect (and none in other ways). A theory is fully generalizable. You can apply Newtonian mechanics to any system of masses and spit out answers. Models are specific. Fully generalized theories have ‘unity’, that is all the assertions of the theory fit together as a single whole, while models generally lack this feature (and in fact I’d venture that this is definitional since otherwise they’re both mathematical systems).

  4. Quantumburrito 4:44 pm 09/5/2012

    tharter: Yes, I completely agree and that’s precisely what I am wondering about, whether we will ever have generalized theories that describe systems as complex and emergent as the economy or the brain. If we don’t we may be stuck with models that describe these systems in a piecewise manner.

  5. Bora Zivkovic 5:27 pm 09/5/2012

    A model can be a) general, b) precise, or c) realistic. It can to some extent be two out of the three. It cannot be all three. One needs to know what is needed, then make the model that fulfills that goal.

  6. julianpenrod 8:00 pm 09/5/2012

    This may end up being removed because I am going to disagree with and even criticize the blogger. They already removed the comment I posted at 4:40 because I showed flaws in “the official story” that they want everyone to believe. But there are a number of points that need to be made.
    Among other things, to even refer to economics as within the bounds of “science” disqualifies this entire article! They use a lot of formulas, but none of them ever apply. They win Nobel Prizes, but nothing is ever put to obvious use. They claim to have “explained” many things, but that’s all they do, claim that they “explained” them! They did nothing to stop the subprime situation! They are not observers sitting back and watching this huge thing called the “economy” gyrate in front of them. All the money or surreptitiously awarded proxy control over all the world’s wealth is in the hands of a few people. Ownership, control and influence are all stratified, and, in any finite population, any stratification must have a topmost level. They do what they want and claim it’s part of “the normal workings of the ‘economy’”, then “economists” build elaborate fabrications to “explain” why things this time acted completely differently from the way they acted a week ago!
    With this kind of pedigree, it’s not surprising that every other part of the article should be eminently questionable.
    For example, and it will eventually be acknowledged, the Perlmutter, Schmidt, Riess award is not for establishing that the universe is accelerating! Their own “results” were that galaxies five billion light years away are moving faster than the Hubble Constant based on nearer galaxies would predict. That means nearer galaxies are moving more slowly than those five billion light years away. But the galaxies nearer to us are nearer in time, too, later in the age of the universe! That means five billion years ago, galaxies were moving faster, but now they’re slower! And that means the universe is slowing, not accelerating!
    And quantum electrodynamics is a “success” only in its unchallenged propaganda. Among other things, how unusual to hear a field defined by the idea of uncertainty claiming unprecedented accuracy! How many noticed that, if Einstein is right that mass is equivalent to energy, measurements of mass should be as subject to uncertainty as anything else in quantum theory?
    And what a curious turn of phrase that “advances in physics agreed with experiment to an astonishing degree of accuracy”! New theories are supposed to agree with experiment! That’s how “science” works! This is praising “science” for doing what it’s supposed to do! As for making predictions that came true, are we supposed to believe, then, that there were absolutely no cases whatsoever where “experiments” in, say, quantum theory disagreed with predictions? That no new claimed mechanisms or interactions were discovered by “experiment” not agreeing with prediction? Or is this using the hoary tactic of acknowledging only those cases where everything went right and acting like all “science” is like that?
    It’s unabashed gloating like this that leads many to trust things like the “experiment” that “proved” cell phones cause accidents by saying that, if you took a call at a restaurant, then got out in your car and a doped-up driver smashed into you, it was your use of the cell phone that caused the accident!

  7. curiouswavefunction 8:30 pm 09/5/2012

    julianpenrod: It’s a little hard to see the substance of your comment through that thicket of words (which is likely why it was caught in the spam filter before), but I am not sure where you are going with this. While I have some sympathy with the economics-is-not-science viewpoint, I don’t subscribe to the economics-is-hokum school of thought either. It should be possible at least in principle to quantify the transactions between rational entities and at least try to estimate the degree of uncertainty arising from such analyses. And economics Nobel Prizes are awarded as much for analysis as for concrete results and predictions. As for the physics Nobel Prize, perhaps you should take it up with the Nobel committee if you think the prizewinning discovery indicated the opposite of what it was supposed to. Also, not all theories in science are created equal with respect to agreement with experiment.

  8. Bryan Sanctuary 9:17 pm 09/5/2012

    I think that models are a part of science and always have been. Basically if you make some assumptions, you have a model. Problem is that with some systems there are too many relevant parameters.

    I like the adage: the more you know about physics, the simpler it becomes, and the more you know about biology, the more complicated it becomes. So in the end I think the dark matter problem and the accelerating universe will have a simple explanation, but I do not think we will find a simple way of generally understanding the functions of drugs.

  9. julianpenrod 10:46 pm 09/5/2012

    It is ironic that curiouswavefunction should describe my comment as a “thicket of words”, given that the blog article is about twice as long. But whatever the length, it is part of what I have complained about before: that too many see something as “too hard” simply because it is more than a certain number of words, and, given the modern prevailing attitude that “whatever we do today is genius”, they summarily condemn something as “irrelevant” or “meaningless” simply because it has what they consider too many words.
    And yet the later Harry Potter books could match L. Ron Hubbard pieces page for page, but how many call them gratuitously overlong?
    And, as for “economics”, if religion had the horrendous track record of “economics”, claiming to “explain” everything but never able to repeat the situation, always having a different “explanation”, never able to make a prediction but always claiming after the fact what purportedly happened, it would be denounced soundly as a betrayal of the trust of mankind! Not once, ever, did “economics” make a single prediction that the “rank and file” can use. It is an accepted truism of “science” that, at some point, any theory has to provide something immediately usable. In point of fact, “economics” does qualify as a fraud. Because its core premise, that the flow of money has innate and inherent properties independent of all, is a lie. The rich control the money.
    They manipulate stock prices.
    They hold up the introduction of valuable new products.
    They discontinue products that are making them money. Low Fat Premium Crackers was a big seller, but Nabisco discontinued them. Their popularity is obvious from the fact that house-brand supermarket fat-free saltines still sell briskly! Every “law” of “economics” is against Nabisco dropping a profitable brand, but they did it.
    A bowling alley near me, immensely popular with kids, was torn down about a decade ago, against all “laws” that you don’t get rid of something that is popular and a money maker. Since then, kids congregate everywhere in tiny clutches doing nothing. A characteristic case of “an open market”, but no one has taken the initiative to answer that need, against the “laws” of “economics”.
    The rich wait until an employee’s pension is ready to kick in, then fire them to avoid paying! They fire someone just before they are due for a pay increase, and agree to rehire them only at starting salary! They manufacture cars with unnaturally high trunk roofs and abnormally narrow rear windows to force motorists to buy expensive closed-circuit rear-view television! They still employ “programmed obsolescence”! They arrange with craven politicians to pass laws mandating the purchase of junk product!
    How anyone can even hear the phrase “creative bookkeeping” and then say that “economics” theories based on those “creative bookkeeping” lies are legitimate completely undermines everything they say.

  10. FB3636 2:15 am 09/6/2012

    To julianpenrod:
    First of all, if it were really possible to interpret distant supernova redshift measurements as “the universe is decelerating,” then surely physicists would have seen that long ago.

    The question is what the error in your logic is. I may have an explanation if you think about it carefully:

    Assume there are only the Earth and the Sun in the universe, the universe started expanding, and the Sun keeps getting farther away from the Earth. Assume we keep measuring the redshift in sunlight to see how fast the expansion is.

    Obviously, if the redshift stays the same, the expansion speed is constant, and if it keeps increasing, the expansion is accelerating.

    Now think about this: would you not agree that measuring the light from a single star (the Sun) at different times as the universe gets older (as we assumed in the example above) is completely equivalent to measuring light from many stars at different distances from Earth (which means they are seen at different times in the universe’s history)?

    If you agree both are equivalent, then assume we made the measurements from different stars, each giving light from a different time in the universe’s history, and we found that the farther ones have more redshift. Does that not mean the expansion is accelerating?

  11. dadster 7:24 pm 09/6/2012

    julianpenrod is certainly right in his view that economics is not a science at all, with which many might readily agree for all the reasons julianpenrod has rightly trotted out in unambiguous terms.

    The more one thinks of it, the more one feels that economics is nothing but the ways and means of looting the public by the “rich investors”. It is all about the art of money making “by” the rich, “for” the rich, using existing resources, including psychological and political ones, and/or inventing new ways and means of ripping off the public. The respectability of the Nobel Prize, gained through its association with the material sciences, has been hijacked by the economists to give credence to their wiles and guiles. Unfortunately, money influences the decisions of governments and of all others who wield power, politicians, and even the media, who are otherwise supposed to be the voice of the 99%. Even ordinary weather forecasts are far better at prediction than Nobel laureate economists, whose predictions have only benefited themselves and their 1% rich coterie while leading the rest of the world to disaster and desolation. Nobel laureate economists and their tribe have been like the Walrus and the Carpenter in Lewis Carroll’s poem of the same name, beguiling and leading the unsuspecting oysters up the garden path to their annihilation en masse.

    As far as the material scientists are concerned, so long as they stuck to explaining what was quantifiable and measurable in the cosmos, they were doing fine. Only when they became over-ambitious and started to exceed their brief, in trying to embrace what was neither quantifiable nor measurable, did they start to slip and fall. It was in over-stepping their limits, trying to bring under their umbrella the life sciences and non-electromagnetic energies like gravitational energy (which is, as per one mathematical model of the cosmos, due to the so-called warps and distortions in the fabric of space-time), that the material sciences found themselves out of their depth.
    The quality of “life” is a non-material quality of the cosmos, since it possesses the characteristics of “will”, an “instinct of survival” and “self-organization”, as opposed to the principle of entropy, or systemic disorganization, or evolving in time to a state of ultimate stability, in which material energies in a closed system tend to culminate and settle down, and which non-material energies like the “vacuum energies” do not obey. The fact that life manifests through matter does not make “life” and “matter” of the same genre; it is rather like the fact that we cannot see light unless it gets reflected from matter, but that does not make light the same as matter, though there is a conversion factor that gives the measurable quantity of the maximum electromagnetic energy “e” that could be “extracted” from any particular amount of mass “m” in the form of the conversion equation e=mc^2. But it would be totally unscientific to rule out the existence of other forms of non-electromagnetic, non-extractable energies lying in the region of e>mc^2, associated with the same amount of mass “m”.

    Other cardinal principles of the material sciences, such as the unacceptability of “action at a distance”, and the insistence on the causative principle that mandates a one-to-one relationship between cause and effect, are also overthrown at quantum levels, where the continuity of energy waves and the discreteness of matter merge and matter and non-matter become indistinguishable; the very concept of objective “measurement” loses its meaning, the mind or intention of the observer starts influencing and creating reality itself, and concepts like “pilot waves” and “quantum entanglement” enter to shape our reality.

    Non-electromagnetic energies such as “life energy” (or bio-energy, or energy with a mind, or “information”), dark energy, vacuum energy and even “dark matter”, with which electromagnetic energies cannot even interact, become the very fundamentals of the cosmos, relegating “matter” to the narrowest realms of “emergent phenomena”, where it rightly belongs.

    A word about electromagnetic energy (e). The “e” of Einstein’s equation is the very limit of matter-based material energies, set at e=mc^2 by the master material scientist, Einstein himself, beyond which lies the vast region of e>mc^2, the region of the said non-material energies. There the speed of light is not a limiting speed; perhaps the “speed of mind”, for example, may exceed the speed of light itself. Of course, mind and matter are totally different fields, the former transcending material physics but still very much that which creates reality itself, even as per material quantum scientists like Niels Henrik David Bohr, the father of quantum science. But he had wisely advised that material scientists should NOT deal with anything that cannot be quantified and measured. Only when his sound advice was flouted, by creating branches of material science such as “biophysics” and “biochemistry” in attempts by material physicists to hijack funds for the life sciences, did physicists have to leave physical experiments and rely on models (mathematical or computer simulations) to translate reality into virtual reality, thereby losing reality itself in translation.

    It is high time that the bio-sciences separate themselves from the harness of the material scientists and effect a paradigm shift in their procedures and thinking to make progress.
    It is hard to understand why life scientists shy away from re-inventing the “elan vital” (or vital force, postulated by the French philosopher Henri Bergson in 1907), even when material physicists in their field could readily rehabilitate Einstein’s greatest blunder, “the (purely arbitrary) cosmological constant”, when it became a possible solution to explain certain physical aspects of the cosmos?

  12. julianpenrod 7:26 pm 09/6/2012

    Everyone seems so determined to abandon all meaning to accept what Perlmutter is claiming!
    To begin with, with respect to FB3636’s comment, for the universe’s expansion to be “accelerating” means that it must have sped up at some time in the past and be continuing to this day. That’s what accelerating means. Objects far away are taken to also be being seen in the past; galaxies closer in are defined as being seen more recently. Galaxies five billion light years away are defined to be doing what they did five billion years ago; galaxies ten million light years away are doing what they did ten million years ago. Perlmutter says the galaxies five billion years ago are moving faster than the Hubble Constant asserts. The Hubble Constant derives from the motions of galaxies close to us, that is, nearer in time, that is, later in the supposed history of the universe. In other words, galaxies close to us, later in the history of the universe, are not receding with a “Hubble Constant” as large as the “Hubble Constant” the galaxies five billion years ago were! That means that the expansion rates now are not as large as they were! That’s what they mean when they say the galaxies five billion light years away are receding faster than they should! The galaxies five billion years ago were responding to a Hubble Constant larger than they are responding to today! That means they are not accelerating; they are decelerating.
    Even their “analytical” method raises eminent questions, if not completely disqualifies what Perlmutter says. They claim to have measured the brightening curve of Type 1a supernovas in the galaxies, to see how bright they became. Ignoring that they may be different chemically from supernovas now, or that they may be undergoing time dilation, or that they may only be in the line of sight and not in the galaxies, the length of brightening should give their absolute brightness. Comparing that to their apparent brightness should give their distance. Perlmutter himself described the supernovas as proving the galaxies “are further away than they should be”. But what informs them of where the galaxies should be? The Doppler shift! They measured the Doppler shift of the light from the stars and, using the nearby Hubble Constant, they found the galaxies should be closer. But the light is dimmer and so indicates the galaxies are further away. But the Doppler shift tells you how fast the galaxy is receding! In other words, the speed of the galaxies, indicated from their Doppler shift, is lower than what is “concluded” from the brightness of the supernovas in them. In other words, they are moving faster than they are moving! You have to give up all reason to accept what Perlmutter says!
    Just because “scientists” don’t admit it, just because they don’t give it a legitimate interpretation, doesn’t mean anything. Nothing verifies that “scientists” are honest. The fact that many don’t get to work in high-profile “science” establishments is consistent with those in influential positions only allowing in those who are “the kind of crooks they can work with”. And shills might ask how the “scientists” could be sure this kind of fraud would work. For the most part, it is working. The proof is conclusive that Perlmutter’s claim is illegitimate, but so many accept what Perlmutter says only because Perlmutter said it! They give him a Nobel Prize and “science” devotees automatically shut down their discriminatory abilities and repeat what they are told to believe! To a large degree, the fraud is working!

  13. Sterndorf 9:52 pm 09/6/2012

    Thoughtfully prepared article, and well written. Thank you.

    It’s my opinion that model use has become a pandemic.
    I’m glad someone with a wide readership is questioning their use. (I wonder if there is a related decline in hands-on experimental research.)

    Here’s an eye opening caution on using models.

    http://www.cosmologyscience.com/glossary.htm#Model

    Two things there are worthy of discussion.

    The first “. . . a model almost never reaches the threshold of a scientific hypothesis or scientific theory.” bears directly on this article’s questions.

    Then “Models are best used as education tools rather than predictive tools.”

    My only suggestion for improving this article is that the term “theory” is widely used, when “hypothesis” is
    the correct term for a new concept.

  14. dadster 10:56 am 09/7/2012

    Julianpenrod, this one again is for you. The energy content of our universe, dark matter (23%) and dark energy (73%), remains an enigma. So, can we get rid of it while still holding on to the theory of “cosmic acceleration”, a theory which is based on experimental observations?

    Since we can depend upon models and thought experiments from now on in science, with equal or better validity than observations from material experiments, here is a suggested thought process through which we do away with the need to assume dark matter or dark energy at all, just like Einstein did with the aether, “the material that fills the region of the universe above the terrestrial sphere”.

    When we observe from Earth, we infer from redshift that distant galaxies are accelerating away from us. We further infer that they are our past, and ergo we are their future, thereby concluding that the postulated Big Bang occurred somewhere at the middle of the distance between us, the energy of the Big Bang radiating outwards from a “central point of burst”, ejecting all matter and non-matter radially and symmetrically. We also observe that for nearer galaxies the redshift indicates slower rates of recession. At very nearby distances we observe little or no increase in the distances. From the symmetry and homogeneity of space-time, we conclude that observations from any other point in the universe would give the same results. There could be many possible explanations for this, but the simplest one that would mesh with the majority of other explanations is that the universe is expanding at an accelerating pace like a balloon, and all material galaxies are on the surface of the energy-balloon.

    Since gravity is the effect of space-time distortions, these “acceleration effects” could be ascribed to the ironing out of the creases of the space-time fabric as the fabric of the balloon stretches, thereby freeing galaxies from the attractions brought about by the creases and folds of space-time (what was reckoned as the force of gravity) that kept the galaxies together. The effect of this freeing up would be small for nearby galaxies, which tend to move together more uniformly, even to the point of the ultimate shredding of the fabric into smaller pieces later, just as two nearby ink dots may remain close together even on the pieces of a burst balloon.

    The universe inherently possessing holographic fractal properties, ultimately the fabric of space-time in which those smaller fragments of matter exist would also “expand” and disintegrate into quarks or whatever smallest entities it can disintegrate into to attain superficial stability, since there is no deep stability in the universe, and would finally merge with the ever-vibrant quantum vacuum energy, or the ultimate information pool of an all-pervasive cosmic awareness, to burst forth again (owing to spontaneous random energy fluctuations of the quantum vacuum) and bloom and blossom into yet another set of multiverses. Poetry? No, it’s simple prose based on modelling the cosmos on balloons, soap bubbles and foams, and the credible concept of “quantum vacuum energy” and its random spontaneous fluctuation properties.

    Hurrah! We have successfully got rid of the mysterious “dark forces” or “dark energy” or “dark matter” or “dark anything” by holding on to the analogy of a balloon! The universe will ultimately become smithereens of superstrings or atto-particles, or just vibrations of energy, whatever, and merge back with the ever-vibrating vacuum energy to pop up again into foams and bubbles of multiverses.

    Someone like Henri Poincaré, who provided the mathematical framework for Einstein’s special relativity, can come up with the mathematics for this model that does away with “dark matter” and “dark energy”.

  15. julianpenrod 12:47 pm 09/7/2012

    It’s now about 12:40 p.m., September 7, the day after I tried to place this comment. It’s more than 17 hours since I tried to place it, and it still hasn’t appeared. This is shorter than the one curiouswavefunction tried to call a “thicket of words”. There has been more than enough time for this to pass any kind of “spam filter”. Saying this may cause the comment to be removed, but it seems evident that Scientific American doesn’t want genuine questioning of what the people are ordered to believe.
    Everyone seems so determined to abandon all meaning to accept what Perlmutter is claiming!
    To begin with, with respect to FB3636′s comment, for the universe’s expansion to be “accelerating” means that it must have sped up at some time in the past and be continuing to this day. That’s what accelerating means. Objects far away are taken to also be being seen in the past; galaxies closer in are defined as being seen more recently. Galaxies five billion light years away are defined to be doing what they did five billion years ago; galaxies ten million light years away are doing what they did ten million years ago. Perlmutter says the galaxies five billion years ago are moving faster than the Hubble Constant asserts. The Hubble Constant derived from the motions of galaxies close to us, that is, nearer in time, that is, later in the supposed history of the universe. In other words, galaxies close to us, later in the history of the universe, are not receding with a “Hubble Constant” as large as the “Hubble Constant” the galaxies five billion years ago were! That means that the expansion rates now are not as large as they were! That’s what they mean when they say the galaxies five billion light years away are receding faster than they should! The galaxies five billion years ago were responding to a Hubble Constant larger than they are responding to today! That means they are not accelerating, they are decelerating.
    Even their “analytical” method raises serious questions, if it does not completely disqualify what Perlmutter says. They claim to have measured the brightening curve of Type Ia supernovas in the galaxies, to see how bright they became. Ignoring that they may be different chemically from supernovas now or that they may be undergoing time dilation or that they may only be in the line of sight and not in the galaxies, the length of brightening should give their absolute brightness. Comparing that to their apparent brightness should give their distance. Perlmutter himself described the supernovas as proving the galaxies “are further away than they should be”. But what informs them of where the galaxies should be? The Doppler shift! They measured the Doppler shift of the light from the stars and, using the nearby Hubble Constant, they found the galaxies should be closer. But the light is dimmer and so indicates the galaxies are further away. But the Doppler shift tells you how fast the galaxy is receding! In other words, the speed of the galaxies, indicated from their Doppler shift, is lower than what is “concluded” from the brightness of the supernovas in them. In other words, they are moving faster than they are moving! You have to give up all reason to accept what Perlmutter says!
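    For reference, the standard-candle arithmetic under dispute can be made concrete. The width of a Type Ia supernova’s light curve pins down its absolute magnitude M; comparing with the measured apparent magnitude m gives the luminosity distance through the distance modulus m − M = 5·log10(d / 10 pc). A minimal sketch, where the apparent magnitude is an illustrative value, not measured data:

```python
import math

def luminosity_distance_pc(m, M):
    """Distance in parsecs from the distance modulus m - M = 5*log10(d / 10 pc)."""
    return 10 ** ((m - M) / 5 + 1)

# Illustrative numbers: Type Ia supernovae peak near absolute magnitude -19.3.
M = -19.3   # absolute magnitude (the standard-candle assumption)
m = 24.0    # hypothetical apparent magnitude of a distant event

d_pc = luminosity_distance_pc(m, M)
d_mpc = d_pc / 1e6  # convert parsecs to megaparsecs
print(f"luminosity distance ~ {d_mpc:.0f} Mpc")
```

    At a given redshift, the acceleration claim amounts to the measured m coming out slightly fainter, that is, d slightly larger, than a decelerating model predicts.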
    Just because “scientists” don’t admit it, just because they don’t give it a legitimate interpretation doesn’t mean anything. Nothing verifies that “scientists” are honest. The fact that many don’t get to work in high profile “science” establishments is consistent with those in influential positions only allowing those in who are “the kind of crooks they can work with”. And shills might ask how could the “scientists” be sure this kind of fraud would work. For the most part, it is working. The proof is conclusive that Perlmutter’s claim is illegitimate, but so many accept what Perlmutter says only because Perlmutter said it! They give him a Nobel Prize and “science” devotees automatically shut down their discriminatory abilities and repeat what they are told to believe! To a large degree, the fraud is working!

  16. curiouswavefunction 4:55 pm 09/13/2012

    To all those who commented: Thanks for your comments, and my apologies for comments not appearing right away. Usually I approve them through an email but this time I didn’t even get an email. I am working with SA staff to make sure comments are instantly approved.

  17. jtdwyer 3:41 pm 09/21/2012

    Regarding the “discovery of dark matter”, as a retired information systems analyst, I address the problem by analyzing the ‘creation’ of dark matter…

    Simply put, discrete objects within the disks of spiral galaxies were found to rotate at roughly the same velocity regardless of their distance from the axis of rotation.

    That contradicted astronomers’ expectations that galaxies would rotate in accordance with Keplerian relations, where each planet’s velocity is determined by its distance from the gravitational pull of the massive Sun (which contains 99.86% of total Solar system mass).

    However, all of the billions of stars and other massive objects in galactic disks each produce their own gravitational field: they are not each independently bound to any central mass; they each gravitationally interact with all other disk masses. Galactic disk objects rotate not as relatively independent planets orbit the Sun, but as loosely bound collections of masses around a common axis.

    It is this simplistic misconception of planetary gravitation, imposed on very large scale compound objects composed of billions of independently gravitating discrete masses, that seemed to require some ‘missing mass’, or ‘dark matter’. All that was ever really missing is the inclusion of dispersed gravitational interactions among the billions of aggregated masses.
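    The contrast described above can be quantified. If a galaxy’s mass were concentrated at its centre, the Keplerian relation gives circular speeds falling off as v = sqrt(GM/r), while observed disk stars orbit at roughly constant speed out to large radii. A minimal sketch of the Keplerian prediction only, where the central mass is a rough assumed figure for a large spiral’s stellar content:

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M_GAL = 1e41    # kg; rough stellar mass of a large spiral galaxy (assumption)
KPC = 3.086e19  # metres per kiloparsec

def keplerian_speed(r_kpc, m_central=M_GAL):
    """Circular speed in km/s if all of m_central sat at the galactic centre."""
    r = r_kpc * KPC
    return math.sqrt(G * m_central / r) / 1e3

# The Keplerian prediction falls as 1/sqrt(r); observed rotation curves stay flat.
for r in (2, 8, 32):
    print(f"r = {r:2d} kpc: Keplerian v ~ {keplerian_speed(r):6.1f} km/s")
```

    The factor-of-two drop from 2 kpc to 8 kpc is exactly what flat rotation curves fail to show; the comment’s argument is that this discrepancy reflects the point-mass idealization rather than missing matter.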

    Please see “Inappropriate Application of Kepler’s Empirical Laws of Planetary Motion to Spiral Galaxies Created the Perceived Galaxy Rotation Problem – Thereby Establishing a Galactic Presence for the Elusive, Inferred Dark Matter”,
    http://fqxi.org/community/forum/topic/1419

    While a multitude of modeling studies representing galactic rotational dynamics have been produced that include enormous peripheral masses attributed to some unidentified, undetectable dark matter, the above essay includes references to several studies more accurately representing galaxy dynamics that produce the observed rotational characteristics without relying on any dark matter or modified gravity.

    In addition, a new study further explains the evolution of spiral galaxies without any metaphysical elements. Please see “A New Model Without Dark Matter for the Rotation of Spiral Galaxies: The Connections Among Shape, Kinematics and Evolution”, http://fqxi.org/community/forum/topic/1290#post_63436

  18. jtdwyer 3:46 pm 09/21/2012

    Correction: the more direct link to the last reference above is: http://fqxi.org/community/forum/topic/1290

  19. myvardi 4:04 pm 09/29/2012

    See “Science Has Only Two Legs” http://cacm.acm.org/magazines/2010/9/98038-science-has-only-two-legs/fulltext

  20. AngelaMaddams 11:07 pm 12/10/2012

    Of course the scientific model will continue to gain importance. Computational modelling and simulation is one of the biggest recent advancements in general science. Processing power has increased exponentially in the last two decades.

