Scientific confirmations 2
H) Billy Meier has long spoken of hybrid technologies combining electronics and biology. It is now certain that we are at the very beginning of all these technologies, and there is no longer much reason why they should not be developed one day. In contact 251:
During this period or about 6 years prior, humans will be converted into machines, that is, robots for the first time by connecting their nervous system to microscopic electronic-biologic gadgetry and machinery that will serve to guide them. This will cause great problems about 85 years later, when the now powerful scientists begin to play 'God', as they had done in earliest times, and they will create new hybrids between humans and animals through genetic alterations. These new 'semi-humans' will declare their solidarity with the robotic humans. But before this transpires, another 80 years will pass after the creation of robotic humans, as I mentioned previously. With the creation of robotic humans, intelligent, biologic-electronic-machinelike robots will be constructed.
source
http://www.futureofmankind.co.uk/Billy_Meier/Contact_Report_251
---->
Scientists have just developed an "organic" robot whose motor is made of cells but which is guided by light.
Swim into the light
A bio-inspired swimming robot that mimics a ray fish can be guided by light. Park et al. built a 1/10th-scale version of a ray fish with a microfabricated gold skeleton and a rubber body powered by rat heart muscle cells. The cardiomyocytes were genetically engineered to respond to light cues, so that the undulatory movements propelling the robot through water would follow a light source.
source
http://science.sciencemag.org/content/353/6295/158
I) Billy Meier has for years described the Plejarens' production facilities.
They have machines called multi-duplicators, which made Billy Meier's accounts sound like science-fiction delirium.
They are used to produce meat without killing any animals, but also to produce entire machines. There is one on every spacecraft, and in homes for family use.
Therefore, no one on Earth should be surprised to learn that the Pleiadians/Plejarans produce meat, as if on an assembly line, to cover their meat demand without having to slaughter a single animal and deprive it of its life. Should it nevertheless happen that too little, or even no, food from animal products is available, this food can of course be replaced with equivalent substances from vegetable products, or products from the fauna, but not indefinitely, because if the substitutes are not again balanced with meat products, damage can gradually occur over time.
Multi-duplicators
The first and perhaps most important point to mention is that all production plants, factories and industrial facilities of the Pleiadians/Plejarans are located underground and in infertile regions, so that no air-polluting waste gases, smoke, etc. arise. There are no smoking and steaming chimneys polluting the air. All building materials of any kind, as well as tools, machines, medication, vehicles and, in part, food as well, are replicated by multi-duplicators.
That this is partly also done with food products, even though these are also produced in completely natural ways through garden, fruit and farm production, is because certain things can simply only be produced in this way, such as meat products, since the cultivation of cell cultures is a long process. The Pleiadians/Plejarans also manufacture robots and androids with multi-duplicators.
Multi-duplicators are as a rule gigantic apparatuses, colossal copying- and duplicating machines so to speak, through which even small space crafts can be manufactured, resp. duplicated, beside countless other products, in which there are no limits to the amount of variations, and which down to a hair, faithfully replicate everything – not just on the outside, but inside, down to the last atom and molecular structure.
In order to replicate any product, it is necessary to possess an atomic blueprint template, with which a multi-duplicator is programmed. This takes place by having the object to be duplicated <read in> through a multi-duplicator scanner, whereupon the duplication can proceed, in any desired number.
Each multi-duplicator works on an electron-energy basis, whereas electrons are virtually inexhaustible. The duplicator taps the inexhaustible seas of electrons which are present everywhere in the entire universe, whereupon the captured resp. gained electrons are transformed into the required working material, from which the desired products are developed and can be multiplied according to preference.
In addition, the electrons also supply the required propulsion and working energy to these wonder apparatuses. Operationally, this fully automatic production process is the responsibility of robots and androids, which also maintain the apparatuses. Consequently, there is really nothing left for humans to do in this process but exercise control functions.
Naturally, Billy adds, there exist not only the enormous multi-duplication facilities built for large-scale and mass production, but also smaller apparatuses of this kind, which can be found in every household as well as in every flying craft designed to carry people, even the smallest. Multi-duplicators can of course also be found in every spacecraft, from the smallest to the largest.
Nevertheless, not all food is simply duplicated through such apparatuses; as already mentioned, the majority of food is naturally produced and consumed fresh. Such natural foods are, however, also preserved in various ways and processed into dried products.
source
http://www.futureofmankind.co.uk/Billy_Meier/Contact_Report_297
http://www.meiersaken.info/Planet_Erra.html
Geneticists will eventually discover a method that enables plants to produce the animal protein required by human beings. Scientists will finally realize that this process is actually feasible through genetic manipulation. This realization is to be expected very soon, although its enactment will not occur until much later. Blame for this delay can be placed on the anti-genetic-manipulation gripers who, insanely and incomprehensibly, are against gene technology and gene manipulation. Antigripers' efforts must be blamed, therefore, for the burden upon life for long-time mass-breeding,
mass-transports and mass-tortures of billions of animals. The solutions brought about by vegetable-animal protein production and a perfectly acceptable meat substitute could have been effected long before through gene technology to be marketed as food supplies, were it not for the decades of insane anti-griping efforts that impede these actions. The blame for the suffering of many billions of animals must be directed toward these idiotic gripers until they are ultimately forced into silence and kept quiet. Only then will the ills of mass animal breeding and all other related tortures for animals find an end.
source
http://www.futureofmankind.co.uk/Billy_Meier/Contact_Report_251
------->
Synthetic meat production:
Will artificial meat soon take over our plates?
After funding the creation of a synthetic meat, Google recently tried to buy a start-up that produces plant-based steak. A nascent industry that could well change our consumption habits.
How is artificial meat created?
The synthetic steak. It is a recipe that does not exactly whet the appetite. The artificial steak, created by the scientist Mark Post in 2013, is made from cow stem cells cultured in the laboratory. 20,000 muscle fibres were needed for the result to resemble a traditional minced steak. The scientist, putting himself in a cook's shoes, added a pinch of salt, breadcrumbs and egg powder, as well as beetroot juice and saffron, to his preparation. Without these last two ingredients, the steak would have had a greyish appearance, far from the true colour of a piece of meat. In the end, the artificial burger weighs 142 grams.
Plant-based meat. Gone is the tofu steak, which some find rather bland. The start-up Impossible Foods has managed to develop a meat truer than nature, made from plants. On its website, the company reveals nothing of its recipe, not even a few of the ingredients used to make its 100% vegetarian burger. A single small clue does, however, appear on one page: "We looked at animal products at the molecular level, then selected specific proteins and nutrients from plant seeds in order to recreate the wonderfully complex experience of meat and dairy products."
source
http://www.francetvinfo.fr/sante/alimentation/la-viande-artificielle-va-t-elle-bientot-s-imposer-dans-nos-assiettes_1028319.html
The artificial-meat burger has Silicon Valley excited
source
http://www.lesechos.fr/industrie-services/conso-distribution/0211256978224-la-viande-artificielle-suscite-lengouement-de-la-silicon-valley-2024929.php
http://www.npr.org/sections/thesalt/2016/06/21/482322571/silicon-valley-s-bloody-plant-burger-smells-tastes-and-sizzles-like-meat
Recently, 3D-printing technology has been moving into more and more fields:
buildings, food, organs, engines.
Health tech fascinates, and rightly so. This Thursday, a fine story of healing comes from Australia, where for the first time 3D-printed vertebrae have been used to treat a patient.
We recently discussed the first printed organs. But even before those are implanted, medicine is already using printed bones. Yet no doctor had so far managed to implant cervical vertebrae, a delicate area of the human anatomy, in a patient.
source
http://www.numerama.com/sciences/148299-des-vertebres-en-impression-3d-implantees-avec-succes-sur-un-malade.html
These technologies will be indispensable for the conquest of space. It is no accident that NASA has invested in developing a machine that prints pizzas.
source
http://io9.gizmodo.com/but-how-does-it-taste-watch-nasas-3d-pizza-printer-in-1508295481
https://2015.spaceappschallenge.org/challenge/print-your-own-space-food/
http://www.science-et-vie.com/2016/02/la-construction-de-tissus-humains-par-imprimante-3d-devient-une-realite/
http://www.3dnatives.com/xtreee-btp-impression-3d-27102015/
http://fr.euronews.com/2015/01/21/l-impression-3d-alimentaire-un-technologie-d-avenir-savoureuse/
http://www.lemonde.fr/cosmos/video/2015/12/18/la-nasa-teste-un-moteur-de-fusee-imprime-en-grande-partie-en-3d_4834914_1650695.html
The Israeli company Massivit 3D Printing Technologies is taking advantage of Drupa to give the first public demonstration of its new Massivit 1800 printer. Visitors are invited to discover the many applications of 3D technology and the market opportunities it offers print-service providers.
This ultra-fast, large-format 3D printer (the fastest currently on the market, according to the manufacturer) can produce high-quality parts of up to 1.5 m x 1.2 m x 1.8 m!
It works thanks to an exclusive hybrid technology called GDP (Gel Dispensing Printing), which consists of depositing molten filament layer by layer to build up a 3D object. GDP also adds UV photopolymerisation of the printed layers. The result: a printer that can print up to 350 mm per hour. Allow roughly 5 hours to print an object the size of a human being. It can also print two different objects simultaneously.
source
http://www.graphiline.com/article/23311/drupa-massivit-l-impression-3d-version-xxl?utm_source=Graphiline-Hebdo&utm_medium=email&utm_campaign=newsletter&utm_content=lien-article-texte
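At the vertical print speed quoted in the article, the "about 5 hours for a human-sized object" figure checks out. A quick sanity check, assuming a human-sized object of roughly 1.8 m (the printer's maximum build height):

```python
# Build-time estimate from the quoted vertical print speed of the Massivit 1800.
speed_mm_per_h = 350        # quoted maximum vertical print speed, mm/hour
object_height_mm = 1800     # a human-sized object, ~1.8 m (assumed)

hours = object_height_mm / speed_mm_per_h
print(round(hours, 1))      # → 5.1, matching the "about 5 hours" claim
```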
J) In contact 212, translated in 2009:
Billy Meier speaks of the Earth's core, comparing it to a sun.
Billy:
Yes, exactly, Ptaah said that, too. But what do you know about global warming? Is this actually to be attributed solely to the environmental degradation, etc. caused by human beings?
Quetzal:
49. Global warming is not just an effect of the environmental and atmospheric pollution caused by the Earth people.
50. The greenhouse effect, in the rotating form of pollution, is also to be attributed to special events and processes that occur on the Sun and whose effects also become noticeable on the Earth.
51. Through this, not only are the atmospheric layers of the Earth warmed up but also the atmospheres of all other planets in the SOL system.
52. Moreover, the Earth, in its interior, generates great heat radiations, which penetrate to the outside and help with the greenhouse effect.
53. The Earth’s core, thus the center of the Earth, is not simply a solid mass, as is erroneously supposed by the earthly scientists; rather, it is a core that is similar to what the Sun is in its entirety.
Billy:
So it’s a bubbling, nuclear furnace.
Quetzal:
54. That is a good comparison.
source
http://futureofmankind.co.uk/Billy_Meier/Contact_Report_212
---->
In 2013, French researchers published an article showing that the centre of the Earth is probably an extremely hot core, comparable to a small sun in which nuclear reactions would take place:
The temperature of the centre of the Earth revealed
Using a very powerful X-ray beam, French researchers have managed to estimate the heat of our planet's core in the laboratory.
We now know a little more about the furnace at the centre of the Earth. Somewhere between 3,000 and 5,000 kilometres beneath our feet, our planet has a liquid core, made essentially of molten iron, at the heart of which hides a solid "seed" that grows as the core cools. The temperature there has long intrigued scientists, to the point that some have even considered drilling deep into the Earth's mantle. Until now, all laboratory simulations had remained irreconcilable with the theoretical calculations, so doubt persisted. But thanks to the performance of the X-ray beam at the European Synchrotron in Grenoble (ESRF), the brightest in the world, a French research team has finally managed to re-evaluate it. And its results prove consistent with the theoretical predictions. At the centre of the Earth, depending on depth, the temperature would thus be between 3,800 and 5,500 °C.
…
A figure which, given the other elements present in the core (sulphur, silicon, carbon, etc.), allowed them to estimate that the average temperature at the centre of the Earth lies between 3,800 °C, at the boundary between the mantle and the core, and 5,500 °C, at the border between the liquid core and the seed. A furnace which, according to their calculations, generates a heat flux of around 10 terawatts, the equivalent of the output of 40,000 nuclear power plants.
This heat from the planetary core is essential in that it drives the movements of the Earth's mantle, which are responsible for plate tectonics as well as volcanic activity. Moreover, it is also this heat, along with that produced by the mantle, which sustains the magnetic field surrounding our planet, giving it a kind of shield capable of deflecting the deadly particles of the solar wind. Without it, the odds are that life would never have appeared on Earth.
source
http://www.lepoint.fr/science/la-temperature-du-centre-de-la-terre-devoilee-03-05-2013-1663083_25.php
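The article's comparison (a heat flux of about 10 terawatts, equated to 40,000 nuclear power plants) implies a fairly modest plant size. A quick check of the arithmetic, using only the figures quoted above:

```python
# Sanity check on the quoted comparison: 10 TW of core heat flux
# equated to the output of 40,000 nuclear power plants.
core_heat_flux_w = 10e12    # ~10 terawatts, per the article
n_plants = 40_000           # number of plants quoted

per_plant_mw = core_heat_flux_w / n_plants / 1e6
print(per_plant_mw)         # → 250.0 (MW per plant)
```

At 250 MW per plant, the comparison assumes reactors well below the roughly 1 GW output of a typical modern plant.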
Melting of Iron at Earth’s Inner Core Boundary Based on Fast X-ray Diffraction
S. Anzellini, A. Dewaele, M. Mezouar, P. Loubeyre, G. Morard
source
http://science.sciencemag.org/content/340/6131/464.full
A new study in Science suggests that the temperature of our planet's core is much, much hotter than previously thought -- 6,000 degrees Kelvin, compared with earlier estimations that were closer to 5,000 degrees Kelvin. This temperature, blazing hot to a degree beyond comprehension, is the same as that of the surface of the sun. Yes, you read that right: the core of our temperate little atmospherically-protected home is as hot as *a star*.
…
Now that is crazy, crazy hot. And there's no doubt that it is absolutely wild to imagine that deep beneath your feet is a center of molten iron as hot as a star. But before you get too bowled over -- our Earth has a star inside! -- there's something you should know: the surface of the sun is weirdly cool, at least when compared with the sun's atmosphere, known as the corona. There, the temperature reaches *1 to 2 million* degrees Kelvin. 6,000 degrees Kelvin suddenly seems practically balmy, doesn't it?
How is it that the sun's corona reaches such a high temperature, while its surface remains so much cooler?
The answer is ... we don't quite know...
source
http://www.theatlantic.com/technology/archive/2013/04/the-earths-core-is-as-hot-as-the-surface-of-the-sun/275346/
K) In contact 215, Billy Meier explains the disappearance of the mammoths. They were wiped out both by humans, who prized their meat, and by climatic upheavals. He notes that most mammoths disappeared 10,000 years ago, but that a few survived until 3,500 years ago on an island in the far north of the Soviet Union. The final eradication, according to him, was brought about by a change of climate.
66. Only a few thousand years ago, this giant sloth lived in North and South America, but it was hunted by humans for its great-tasting meat until it was wiped out, as this also happened with the mammoth, which was also largely eliminated by the humans at that time, while the remaining animals found their end through climatic upheavals and natural disasters.
…
Billy:
You said that along with the fact that humans have contributed to the disappearance of the mammoths from the Earth, in the end, also climate changes were responsible for this.
Quetzal:
126. That’s right.
127. Humans very well eradicated the mammoth as far as possible, but the final extinction was caused by climate changes.
Billy:
The mammoth disappeared, yes, about 10,000 years ago.
Quetzal:
128. That only partially represents the truth because the last of these animals still lived 3,500 years ago, on an island to the far north of today's Soviet Union.
129. However, it is true that about 10,000 years ago, mammoths were very strongly reduced by humans, who hunted them very much for their meat, hides, and bones.
130. The meat was used as tasty food, while the hides and the bones found use for the building of huts.
source
http://www.theyfly.com/articles/WILL_HUMANITY_WAKE_UP.html
------->
Michael Horn notes that mammoths have indeed been found to have most likely fallen victim to climate change, on the basis of simulations of vegetation as a function of climate.
Some scientists have argued that it was principally the result of climate change while others say that it was driven by pressures of a growing human population, or even a cataclysmic meteor strike.
Now, according to Professor Brian Huntley of Durham University, that debate has been settled.
"What our results have suggested is that the changing climate, through the effect it had on vegetation, was the key thing that caused the reduction in the population and ultimate extinction of mammoths and many other large herbivores," he said.
Professor Huntley and his colleagues created a computer simulation of vegetation in Europe, Asia and North America over the last 42,000 years.
source
http://www.bbc.com/news/science-environment-11000635
It is also worth noting that the most recent traces of mammoths were discovered on Wrangel Island and are dated to 3,600 years ago.
source
http://io9.gizmodo.com/5896262/the-last-mammoths-died-out-just-3600-years-agobut-they-should-have-survived
In this article, dated 2012, the researchers ask what the last mammoths fell victim to. They explore the hypothesis of genetic degeneration, but it does not hold up; they point instead to a human or climatic cause.
Instead, Dalen and the rest of the team believes some drastic change must have occurred on Wrangel Island to kill off the mammoths, and there are two likely culprits: humans and climate. Archaeological evidence suggests that humans reached Wrangel Island at roughly the same time the last mammoths vanished, but there's no evidence yet to indicate that they ever hunted the mammoths. The more likely answer is climate change, which as a side effect might well have made it easier for humans to reach the island to serve as witnesses to the mammoths' final days.
An article dated August 2016 seems to provide the explanation: the mammoths died of thirst. Because of rising sea levels, access to fresh water was reduced, and the animals exhausted and fouled the water holes that remained accessible. So it was indeed a climate change that caused the final extinction of the mammoths.
Scientists believe that human hunting and environmental changes played a role in their extinction.
But the group living on St Paul Island, which is located in the Bering Sea, managed to cling on for another 5,000 years.
This study in the Proceedings of the National Academy of Sciences suggests that these animals faced a different threat from their mainland cousins.
…
As the Earth warmed up after the Ice Age, sea levels rose, causing the mammoths' island home to shrink in size.
This meant that some lakes were lost to the ocean, and as salt water flowed into the remaining reservoirs, freshwater diminished further.
The fur-covered giants were forced to share the ever-scarcer watering holes. But their over-use also caused a major problem.
Lead author Prof Russell Graham, from Pennsylvania State University, said: "As the other lakes dried up, the animals congregated around the water holes.
"They were milling around, which would destroy the vegetation - we see this with modern elephants.
"And this allows for the erosion of sediments to go into the lake, which is creating less and less fresh water.
"The mammoths were contributing to their own demise."
He said that if there was not enough rain or melting snow to top the lakes up, the animals may have died very quickly.
"We do know modern elephants require between 70 and 200 litres of water daily," Prof Graham said.
"We assume mammoths did the same thing. It wouldn't have taken long if the water hole had dried up. If it had only dried up for a month, it could have been fatal."
The researchers say climate change happening today could have a similar impact on small islands, with a threat to freshwater putting both animals and humans at risk.
'Best understood extinction'
Commenting on the study, Love Dalen, professor in evolutionary genetics at the Swedish Museum of Natural History, said: "With this paper, the St Paul Island mammoth population likely represents the most well-described and best understood prehistoric extinction events.
"In a broader perspective, this study highlights that small populations are very sensitive to changes in the environment."
source
http://www.bbc.com/news/science-environment-36945909
L) In one of the very first contacts, contact No. 4, Semjase explains to Billy Meier that the ships use two propulsion systems: one based on light for interplanetary travel ("short" distances within a solar system, for example), the other based on tachyons for travelling faster than light (interstellar travel).
Billy
My next question refers to what you have already explained during my — that is, “our” - first meeting (contact 1). People on Earth will never be capable of travelling into the true, deep outer space unless they invent another method of propulsion. I can only imagine what you mean with the term propulsion, e.g. that it must involve a form of beam drive - a hyper-drive, so to speak. In my opinion it would need to consist of a drive that alters matter in some way, probably while the speed of light is exceeded. In the process, the beamship is hurled into hyperspace, in which space and time are paralysed, as you have already explained. I assume that space and time collapse in a manner whereby they are somehow completely nullified.
Semjase
30. You would make a great scientist.
31. This is really phenomenal considering that all of your knowledge is based on autodidactic work.
32. You are completely correct in your assumptions.
33. To travel through real outer space, one needs a drive that surpasses the speed of light many times over.
34. This propulsion can only become activated, however, when the speed of light has already been reached.
35. As a result, another drive is needed to regulate the normal speed up to that of light.
36. This means then that a beamship needs two propulsion systems: first, a normal drive which permits acceleration up to and below the speed of light and, second, a hyperdrive as you call it.
37. A drive, therefore, which generates a velocity a million and billion times that of light; the hyperspeed, which enables us to enter hyperspace.
38. A space in which every mass expands in proportion to the increase in speed.
39. Consequently, time and space collapse and they become null-time and null-space.
40. That is to say:
41. Space and time simply cease to exist.
42. And exactly by this manner is created the fact that distances of countless light-years can be traversed in a fraction of a second without causing a shift in time.
source
http://www.futureofmankind.co.uk/Billy_Meier/Beamship
------->
A paper by NASA scientists, dated 17 November 2016, has just been published, suggesting that the EmDrive, a propulsion engine that uses electromagnetism, is indeed real. The problem is that, if confirmed, this would defy the laws of physics as we know them.
NASA's EM Drive engine gains credibility
NASA's electromagnetic engine has just been presented in a scientific journal. A first step for this space-propulsion concept, which was long dismissed as science-fiction delirium.
PROPULSION. After drawing years of criticism, oscillating between sneering and caution, NASA's famous and nebulous EM Drive electromagnetic engine has scored a point: a presentation of this propulsion scheme has just been published as an article in a specialist journal. Such peer review is the first condition for a piece of work to qualify as scientific. A first success for the EM Drive, which specialists must now re-examine with a more serious eye. The Journal of Propulsion and Power thus shows that this engine, long criticised as contradicting basic physics, could work, and could even reach the planet Mars in 70 days, all without propellant!
Turning an electric current into a microwave beam
The EM Drive uses the principle of the magnetron, a device that converts an electric current into a beam of microwaves. What serves as its nozzle is shaped like a truncated cone containing a vacuum: the microwaves bounce back and forth between the two end walls. In theory, there should be more impacts on one of the two walls, which should create a tiny thrust... six times weaker than the photonic thrust of solar sails, for example. A thrust so feeble that the device's detractors have often put it down to measurement error...
The electromagnetic engine is the product of the dogged work of NASA's Eagleworks laboratory team, which studies the wildest propulsion ideas. This is the same team that proposed the Warp Drive, or Alcubierre drive, named after Miguel Alcubierre, a theoretical physicist who studied the idea: a system that modifies the geometry of space-time in order to move forward... inspired by science fiction, among others the Star Trek series.
source
http://www.sciencesetavenir.fr/fondamental/le-moteur-em-drive-de-la-nasa-passe-une-etape-decisive_108338
A vacuum test campaign evaluating the impulsive thrust performance of a tapered radio-frequency test article excited in the transverse magnetic 212 (TM212) mode at 1,937 MHz has been completed. The test campaign consisted of a forward thrust phase and reverse thrust phase at less than 8×10^-6 torr vacuum with power scans at 40, 60, and 80 W. The test campaign included a null thrust test effort to identify any mundane sources of impulsive thrust; however, none were identified. Thrust data from forward, reverse, and null suggested that the system was consistently performing with a thrust-to-power ratio of 1.2 ± 0.1 mN/kW.
source
http://arc.aiaa.org/doi/10.2514/1.B36120
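The reported thrust-to-power ratio makes it easy to see how tiny the measured forces were at the three tested power levels, assuming the 1.2 mN/kW figure scales linearly with power:

```python
# Estimated thrust at the three tested power levels, assuming the
# reported 1.2 mN/kW thrust-to-power ratio scales linearly.
ratio_mn_per_kw = 1.2

for power_w in (40, 60, 80):
    thrust_mn = ratio_mn_per_kw * power_w / 1000.0
    print(f"{power_w} W -> {thrust_mn:.3f} mN")
```

Even at 80 W the implied thrust is on the order of a tenth of a millinewton, which is why sceptics point to possible measurement artefacts.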
But if the EMdrive is truly reactionless, then Newton is wrong. Also, Einstein is wrong, Maxwell is wrong and all of quantum physics is wrong. There’s a fundamental symmetry that causes momentum conservation: translational symmetry. It means that if my system is over here, at a certain point in space, it should obey the same laws as if it’s over there, at a different point in space. But if momentum conservation isn’t truly fundamental, then translational symmetry cannot be a good symmetry of the Universe. In other words, there must be a preferred location, where the laws of physics are different in one location than others. The laws of physics, all of a sudden, depend on position.
…
The point isn’t that physics is wrong, nor is the point that the Eagleworks team is wrong. The point is that this is the beginning stages of actual science being done to examine an effect. The most likely outcome is that momentum really is conserved and there’s something funny going on here. For faster-than-light neutrinos, it was a loose cable. For the BICEP2 results, it was an incorrect calibration of galactic gas. For cold fusion, it was a poor experimental setup, and for perpetual motion, it was a scam. No matter what the outcome, there’s something to be learned from further investigation. Whether it’s new physics and a new type of engine results, or whether it’s simpler than that and the effect’s cause simply hasn’t been determined yet, more and better experiments will be the ultimate arbiter. This is why we do the science in the first place.
source
http://www.forbes.com/sites/startswithabang/2016/11/23/how-physics-falls-apart-if-the-emdrive-works/#2c63b6554b0c
M) According to Billy Meier, the speed of light is not constant but is gradually decreasing. According to Ptaah, Einstein's theory of relativity will undergo several changes.
Q- Ptaah told you once, that Plejarans don't use a light years unit to measure distance, because it is our unique invention and completely wrong, because light in space don't always travel with constant velocity of "c". Yet in modern physics the speed of light in vacuum is "a holy constant" and always equal to "c". The constant speed of light for each observer is also a cornerstone of Einstein Theory Of Relativity. My question: Was Einstein wrong? Is his Theory of Relativity based on false assumptions?
A- Yes, Einstein was wrong. The speed of light is not constant throughout the duration of the existence of our universe. It gradually slows down. There exists a half-life (Halbwertszeit).
For us at the present time the speed of light is (appears) constant.
CR 251
Albert Einstein's theory on relativity will undergo several additional modifications.
source
http://www.futureofmankind.co.uk/Billy_Meier/Contact_Report_251
CR 119 :
Billy:
Then just not. But is it right, now, if I have calculated that the Creation's expansion rate, for the initial period, was 44,069,497.5 kilometers per second, with a steadily constant half-life rate of almost exactly 6,347,755,102,040 years, from which the results arise that the expansion rate of the Creation at its universal beginning was 147 times the speed of today's speed of light constant, but this speed decreased with a half-life of 6,347,755,102,040 years and continues to decrease, so the starting point of today's light constant lay at a speed of 344,292.9 kilometers per second, but through the already elapsed portion of half-time, it has already dropped by 44,500.4 kilometers per second, whereby the present and current light constant of 299,792.5 kilometers per second arises, according to which an original light year, from the starting point of the current light constant, of around 1.390 X 10^15 km has amounted.
This means, according to my calculations, that the constant of one second of the initial period of the expansion rate must have been 147 times faster than the constant of one second today, because at that time, around 46 trillion years ago, the speed of light was even 44,069,497.5 kilometers per second.
source
http://www.billymeiertranslations.com/pdf/Plejaren%20Contact%20Reports%20Vol.%203.pdf
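An illustrative arithmetic check of the figures quoted above, using only the numbers given in the passage (the physics claim itself is not being endorsed, only the internal consistency of the quoted values):

```python
import math

c_today = 299_792.5                 # km/s, the light constant quoted above
factor = 147                        # claimed ratio of initial rate to today's constant
half_life = 6_347_755_102_040       # years, the half-life quoted above

# The quoted initial expansion rate is exactly 147 times today's constant:
initial_rate = factor * c_today
assert initial_rate == 44_069_497.5  # km/s, as stated in the text

# The quoted starting point minus the quoted decrease gives today's constant:
start_c = 344_292.9                 # km/s, starting point of the current constant
drop = 44_500.4                     # km/s, decrease over the elapsed half-time
assert abs((start_c - drop) - c_today) < 0.1  # 299,792.5 km/s

# A factor of 147 corresponds to about 7.2 halvings (2**7.2 ≈ 147),
# i.e. roughly 7.2 * half_life ≈ 45.7 trillion years -- consistent with
# the "around 46 trillion years ago" mentioned in the text.
n_halvings = math.log2(factor)
print(round(n_halvings, 2), round(n_halvings * half_life / 1e12, 1))
```

The numbers in the passage thus hang together arithmetically under a simple half-life model c(t) = c0 · 2^(−t/T), which is an assumption read out of the text, not something stated in the source articles.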
---------->
Scientists are developing a theory, which could be tested in the future, based on a variable speed of light that was higher at the birth of the universe. This could modify Einstein's theory of gravity.
RADICAL IDEA
Professor Magueijo said: "The theory, which we first proposed in the late-1990s, has now reached a maturity point – it has produced a testable prediction. If observations in the near future do find this number to be accurate, it could lead to a modification of Einstein's theory of gravity.
"The idea that the speed of light could be variable was radical when first proposed, but with a numerical prediction, it becomes something physicists can actually test. If true, it would mean that the laws of nature were not always the same as they are today."
The testability of the varying speed of light theory sets it apart from the more mainstream rival theory: inflation. Inflation says that the early universe went through an extremely rapid expansion phase, much faster than the current rate of expansion of the universe.
…
The varying speed of light theory suggests that the speed of light was much higher in the early universe, allowing the distant edges to be connected as the universe expanded. The speed of light would have then dropped in a predictable way as the density of the universe changed. This variability led the team to the prediction published today.
source
http://phys.org/news/2016-11-theory-einstein-physics.html
n) Years ago, Billy Meier described the dangers of space travel, in particular the effects of radiation and microgravity on the brain, evoking problems of slowed reactions, inflammation, brain atrophy, and, in the long term, dementia or even death.
275. The free space hides many dangers in itself, which the Earth person is still in no way aware of.
276. In particular, the very dangerous kind of space-conquering, as is pursued by the Earth people, releases damage to the health in the person.
277. First and foremost, the Earth people have no knowledge about the hazardous, body-damaging, as well as organ-damaging radiations, which prevail in all of space and pass through it.
278. On the other hand, the realization also escapes them that the human body cannot cope with weightlessness on a permanent basis, which is why it already begins to take on physical and organic damage after seventy hours of a weightless state.
279. If the Earth person, as well as any other race to be classified as space mastering, wants to pursue space flight, then the spacecraft equipment must be adapted to the given conditions in all respects, as well as the space suits for the life forms themselves.
280. Space flight equipment and space suits must be safeguarded and made resistant by a special insulation shielding layer with regard to the various body-damaging and organ-damaging space radiations and space vibrations.
281. This is the most important factor for the preservation of life in space for the human, and so it is also the most important factor for the conservation of organic and physical health of the people who move through space with space-competent missiles or in protective suits.
282. The second and equally important factor in this regard is based on gravity and is to be observed with equally great importance as with the shielding against the space radiations and space vibrations.
283. If these factors are not taken into account, and thus, the spacecraft and protective suits of the people are not prepared accordingly in a way that the missiles and protective suits are made resistant against the outside influences of the radiations and vibrations by special insulation shields and that the spacecraft and protective suits are equipped with their own gravitational fields, then the body and all organs and bones of the space traveling people will take damage to the health.
284. Radiations, vibrations and the sorts, unprotected flying objects and also such overalls, as well as the weightlessness of interstellar space lead, in the very first place, to health damages in the brain and in the bones of humans and many other life forms.
285. These, together with many other forms of injury to health, which spread to the whole body and to all organs.
286. Thus, if the human life form is not protected in space by special shields and by artificial gravitational fields against the space radiations and space vibrations and against the weightlessness, then he will suffer health damage, which, in a stark case, usually leads to death.
287. The first severe reaction of the brain injury that I mentioned, for example, leads to barely detectable brain swelling in very minor cases, which will, after some time, lead to thought and action uncertainty and then inevitably result in reaction loss, such as, for example, the sudden loss of control of a vehicle or aircraft or the appearance of completely faulty actions against all reason.
288. These kinds of minor cases already occur with those people who, even on the Earth, linger in containers where weightlessness is produced, but on the other hand, they also appear in all those Earth people who leave the Earth for only a very short duration and get out above the earthly ozone layer.
289. Truly, all of that may only be done then, when the necessary precautions are sufficient enough; otherwise, the health damages are inevitable.
290. However, if a human or any other life form lingers for a very long time, such as many months or years, unprotected in weightlessness in space, then the initially developing brain swelling of an inflamed form will suddenly develop in reverse sequence, by which brain atrophy then develops, as with weak-thinking and elderly people.
291. Even the brain substance itself suffers a loss; thus, the entire brain mass passes through this phenomenon of a pathological nature.
292. This symptom of illness, and it evidently deals with such, is caused on the one hand by the uninhibited influence of space vibrations and space radiations of various kinds, as well as by weightlessness.
293. The illness originating from these factors inflames the brain substances and the brain itself, after which a new illness factor rapidly arises, which expresses itself as a decrease of brain activity, through a kind of palsy of cerebral substance, which then leads to the general shrinkage of total brain mass, which can no longer be stopped by human and medical and other similar means.
294. If the person lingers long enough unprotected and weightlessly in space, then the brain contraction ultimately leads to the point where the person loses the absolute control over himself, his thinking, and actions and life.
295. The ultimate end, then, is insanity and death.
Billy
Exactly; that is what you explained to me back then, but how long will it still take before the people of the Earth recognize the first facts of these matters?
Quetzal
296. It will be the time around the middle of 1982.
297. But in truth, only a few facts will be fathomed initially, while the final or, at least, the further scope of the effective space threats will be recognised only much later, after the initial space flights have already claimed the lives of Earth people.
source
http://futureofmankind.co.uk/Billy_Meier/Contact_Report_150
————>
It is only recently that scientists have begun to examine the question, using MRI scans and conducting irradiation experiments on mice.
Will astronauts traveling to Mars remember much of it? That's the question concerning University of California, Irvine scientists probing a phenomenon called "space brain."
UCI's Charles Limoli and colleagues found that exposure to highly energetic charged particles -- much like those found in the galactic cosmic rays that will bombard astronauts during extended spaceflights -- causes significant long-term brain damage in test rodents, resulting in cognitive impairments and dementia.
Their study appears in Nature's Scientific Reports. It follows one last year showing somewhat shorter-term brain effects of galactic cosmic rays. The current findings, Limoli said, raise much greater alarm.
"This is not positive news for astronauts deployed on a two-to-three-year round trip to Mars," said the professor of radiation oncology in UCI's School of Medicine. "The space environment poses unique hazards to astronauts. Exposure to these particles can lead to a range of potential central nervous system complications that can occur during and persist long after actual space travel -- such as various performance decrements, memory deficits, anxiety, depression and impaired decision-making. Many of these adverse consequences to cognition may continue and progress throughout life."
For the study, rodents were subjected to charged particle irradiation (fully ionized oxygen and titanium) at the NASA Space Radiation Laboratory at New York's Brookhaven National Laboratory and then sent to Limoli's UCI lab.
Six months after exposure, the researchers still found significant levels of brain inflammation and damage to neurons. Imaging revealed that the brain's neural network was impaired through the reduction of dendrites and spines on these neurons, which disrupts the transmission of signals among brain cells. These deficiencies were parallel to poor performance on behavioral tasks designed to test learning and memory.
In addition, the Limoli team discovered that the radiation affected "fear extinction," an active process in which the brain suppresses prior unpleasant and stressful associations, as when someone who nearly drowned learns to enjoy water again.
"Deficits in fear extinction could make you prone to anxiety," Limoli said, "which could become problematic over the course of a three-year trip to and from Mars."
Most notably, he said, these six-month results mirror the six-week post-irradiation findings of a 2015 study he conducted that appeared in the May issue of Science Advances.
...
While dementia-like deficits in astronauts would take months to manifest, he said, the time required for a mission to Mars is sufficient for such impairments to develop. People working for extended periods on the International Space Station, however, do not face the same level of bombardment with galactic cosmic rays because they are still within the Earth's protective magnetosphere.
Limoli's work is part of NASA's Human Research Program. Investigating how space radiation affects astronauts and learning ways to mitigate those effects are critical to further human exploration of space, and NASA needs to consider these risks as it plans for missions to Mars and beyond.
Partial solutions are being explored, Limoli noted. Spacecraft could be designed to include areas of increased shielding, such as those used for rest and sleep. However, these highly energetic charged particles will traverse the ship nonetheless, he added, "and there is really no escaping them."
Preventive treatments offer some hope. Limoli's group is working on pharmacological strategies involving compounds that scavenge free radicals and protect neurotransmission.
source
https://www.sciencedaily.com/releases/2016/10/161010052832.htm
2015 study :
As NASA prepares for the first manned spaceflight to Mars, questions have surfaced concerning the potential for increased risks associated with exposure to the spectrum of highly energetic nuclei that comprise galactic cosmic rays. Animal models have revealed an unexpected sensitivity of mature neurons in the brain to charged particles found in space. Astronaut autonomy during long-term space travel is particularly critical as is the need to properly manage planned and unanticipated events, activities that could be compromised by accumulating particle traversals through the brain. Using mice subjected to space-relevant fluences of charged particles, we show significant cortical- and hippocampal-based performance decrements 6 weeks after acute exposure. Animals manifesting cognitive decrements exhibited marked and persistent radiation-induced reductions in dendritic complexity and spine density along medial prefrontal cortical neurons known to mediate neurotransmission specifically interrogated by our behavioral tasks. Significant increases in postsynaptic density protein 95 (PSD-95) revealed major radiation-induced alterations in synaptic integrity. Impaired behavioral performance of individual animals correlated significantly with reduced spine density and trended with increased synaptic puncta, thereby providing quantitative measures of risk for developing cognitive decrements. Our data indicate an unexpected and unique susceptibility of the central nervous system to space radiation exposure, and argue that the underlying radiation sensitivity of delicate neuronal structure may well predispose astronauts to unintended mission-critical performance decrements and/or longer-term neurocognitive sequelae.
source
http://advances.sciencemag.org/content/1/4/e1400256
So far, they’ve found that a microgravity environment can lead to changes in brain structure and take a serious toll on astronauts’ ability to think. The astronauts have had a more difficult time completing mental tasks and with physical coordination during and after spending time aboard the ISS.
source
http://www.huffingtonpost.com/entry/space-astronauts-brain_us_563cb57ce4b0307f2cad0950
To date, hampered physiological function after exposure to microgravity has been primarily attributed to deprived peripheral neuro-sensory systems. For the first time, this study elucidates alterations in human brain function after long-duration spaceflight. More specifically, we found significant differences in resting-state functional connectivity between motor cortex and cerebellum, as well as changes within the default mode network. In addition, the cosmonaut showed changes in the supplementary motor areas during a motor imagery task. These results highlight the underlying neural basis for the observed physiological deconditioning due to spaceflight and are relevant for future interplanetary missions and vestibular patients.
source
http://link.springer.com/article/10.1007%2Fs00429-015-1054-3