Science and Technology

The Earth isn’t as bright as it once was

 

A finding that seems a little out of this world is now a reality for researchers who have spent the last two decades studying the cosmos.

They discovered the Earth is not as bright as it once was and has been dimming at a noticeable pace in the last few years.

Using a telescope that does not look much different from one you might have at home, researchers at the Big Bear Solar Observatory have been taking measurements each night for the last 20 years to study the solar cycle and Earth’s cloud cover.

They did this by measuring the “earthshine,” which occurs when “the dark face of the Moon catches Earth’s reflected glow and returns that light,” according to NASA. The amount of earthshine varies from night to night and from season to season.

“You look at a quarter moon. You can see the whole moon because three quarters of it is illuminated in this ghostly light,” said Philip Goode, a researcher at New Jersey Institute of Technology and the lead author of the new study.

After 20 years of measuring the “ghostly light,” they found it to be fading.

“It’s actually the sunlight reflected off the earth, and that’s what’s getting dimmer,” said Goode.

In fact, the Earth is now reflecting about half a watt less light per square meter than it was 20 years ago, the equivalent of a 0.5% decrease in the Earth’s reflectance. Earth reflects about 30% of the sunlight that shines on it.
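
For a rough sense of how those figures fit together, here is a back-of-the-envelope check in Python (our illustration, not part of the study; it assumes the commonly cited globally averaged insolation of about 340 watts per square meter at the top of the atmosphere):

# Back-of-the-envelope check; the 340 W/m^2 average insolation is an assumed round figure
avg_insolation = 340.0               # globally averaged sunlight at the top of the atmosphere, W/m^2
albedo = 0.30                        # Earth reflects about 30% of incoming sunlight
reflected = avg_insolation * albedo  # roughly 102 W/m^2 sent back to space
drop = 0.5                           # reported decrease in reflected light, W/m^2
print(f"reflected ~{reflected:.0f} W/m^2; a {drop} W/m^2 drop is about {drop / reflected:.1%}")
# prints: reflected ~102 W/m^2; a 0.5 W/m^2 drop is about 0.5%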

“A lot of these things, you can surrender your common sense at the door and there are a lot of surprises,” Goode said. “These are one of those surprises.”

For the first 17 years, the data looked more or less the same, to the point where the researchers almost called off the rest of the study.

“We were sort of reluctant to do the last three years of data because it looked the same for 17 years, but finally we decided to do it because we promised ourselves 20 years of data so let’s just do this, and we got the unexpected,” said Goode.

In a shocking turn of events, the last three years of their study showed the earthshine had gone down dramatically, so much so that they thought their data was flawed.

“When we analyzed the last three years of data, it looked different,” Goode said. “The reflectance had gone down and had gone down noticeably. So we thought we had done something wrong. So we redid it several times and it turns out it was correct.”

They noticed the data didn’t correlate with the varying brightness of the sun over its solar cycle, which meant the cause had to be something else.

What they noticed was a decrease in cloud cover. Sunlight bounces off cloud tops and is reflected back into space, so when cloud cover decreases, more sunlight gets through.

“The earth is getting more heat because the reflected light is being reduced, so it’s getting more sunlight coming in, in the visible spectrum,” said Goode.

The biggest decrease in cloud cover was over the west coasts of North and South America, the same region where sea surface temperatures have been rising due to a reversal of a climate pattern called the Pacific Decadal Oscillation (PDO).

The PDO refers to long-term fluctuations in ocean temperature across the Pacific. As the ocean warms and cools in different places, it directly affects the path of the jet stream, and this shifting of the jet stream in turn shapes weather conditions and long-term climate, especially along the western coasts of North and South America.

“Off the west coast of the Americas, the low-lying clouds were burned away and more sunlight came in, so the way we saw it was, the reflectance of the Earth had dropped,” said Goode.

Goode stopped short of saying it would have a direct impact on the Earth warming faster. “Sure, the Earth is getting an extra half a watt per square meter, but what the Earth chooses to do with this energy, we would be sorta guessing.”

 


Scientists find evidence of humans making clothes 120,000 years ago

By Brian Handwerk

 

 

Fur is a controversial fashion statement these days. But stepping out in a wildcat cape or jackal wrap was de rigueur for Pleistocene humans, according to the recent discovery of a 120,000-year-old leather and fur production site that contains some of the oldest archaeological evidence for human clothing.

Homo sapiens at the site first made and wore clothes around the onset of an Ice Age, which may suggest that, even in relatively mild Morocco, clothes were adopted as a way to keep warm. But the invention of animal-based apparel also corresponds with the appearance of personal adornments, like shell beads, which hints that prehistoric clothing, like today’s styles, could have been about fashion as well as function.

Emily Hallett, of the Max Planck Institute for the Science of Human History in Germany, didn’t set out to investigate where and when humans started wearing clothes, which decompose and vanish after a few thousand years at most. Initially interested in diet, she was examining bones to see which animals Pleistocene humans ate, and how they butchered them, in Contrebandiers Cave on Morocco’s Atlantic Coast.

But Hallett found bones she wasn’t expecting: dozens of tools carefully shaped, smoothed and polished into implements ideal for scraping hides clean to make leather, and scraping pelts to produce furs. “They look like the tools that people still use today to process hides for leather and fur,” Hallett says, noting that similar tools have also been found associated with the same tasks in far younger archaeological sites. Hallett, who co-authored a study on the findings in the September 16 issue of the journal iScience, worked with a team that included the late Harold Dibble, an influential archaeologist from the University of Pennsylvania.

The researchers found 62 different bone tools in Middle Stone Age layers dated to 90,000 to 120,000 years ago. Despite their age, the implements are relatively specialized instruments for the tasks at hand, which suggests that humans first started using cruder versions of such tools to process fur and skins at an even earlier date.

Oddly, a single marine mammal tooth was also found in the cave, dated to about 113,000 years ago, a first for Pleistocene archaeological sites in North Africa. Future molecular analysis should identify the species, but the shape strongly suggests it came from an ancient sperm whale. Signs of wear on the tooth may have occurred while the animal was alive, but the tooth might also have been used as some type of flaking tool to sharpen another tool’s edge by applying careful pressure.

But the bone tools tell only half of the story. Hallett also noticed that a lot of carnivore bones piled in the cave still bore the telltale marks of being cut by humans.

The remains of sand foxes, golden jackals and wildcats clearly showed marks like those still created by skinning techniques today. Incisions were made to detach the skin at each of the animal’s four paws so that the skin could be pulled in one piece to the animal’s head. Skin at the head was then removed by cutting around the lips, which is also evidenced by ancient cut marks. These carnivore species show no marks of butchery that would suggest they were eaten, only the cuts necessary to remove skin. On the other hand, the remains of other animals, including bovids akin to ancient cows, show clear signs that they were processed for meat for the Pleistocene dinner table.

“Once those two pieces were there, bone tools used to prepare leather and fur and carnivore bones that have marks for fur removal, we put that together and realized that it’s most likely this was evidence for the making of clothing,” notes Hallett.

The evidence suggests that North African cave dwellers were making and wearing clothing long before the great migrations of humans to which all living non-Africans can trace their roots. When those Homo sapiens left Africa to populate the corners of the globe, it appears that they likely did so adorned in an array of animal skins and furs.

The reason why our ancestors began creating those clothes in the first place may be more complex than it appears at first glance. It’s often theorized that many human cognitive and evolutionary leaps were born of necessity—adapt or die. Early modern humans and Neanderthals needed, and seem to have produced, clothing to survive in colder times and places like Ice Age Europe (15,000 to 70,000 years ago).

But the climate around Contrebandiers Cave in Morocco was relatively mild 100,000 years ago, as it remains today. That’s led some, including Hallett, to suggest that clothing might not have been needed for survival. But Ian Gilligan, author of Climate, Clothing and Agriculture in Prehistory, says Northern Africa can be surprisingly cold at times even in warmer eras, so that cold snaps and conditions like hypothermia would have presented a definite threat. Humans might well have adopted clothing for comfort against the chill even when conditions were not extreme, adds Gilligan, an archaeologist at the University of Sydney who was not involved with the study.

“This new study really pushes back the first good archaeological evidence for the manufacture of clothing, and it’s coinciding nicely with the beginning of the last Ice Age about 120,000 years ago, so I think that’s really significant,” Gilligan says. “It’s precisely at the time when you’d expect to see the first clothing for protection from cold in context of the glacial cycles.”

The earliest previous technological evidence for clothing didn’t appear until about 75,000 years ago, at Southern African sites like Blombos Cave and Sibudu Cave. There scientists found the first confirmed bone awls, with microwear on the tips suggesting they were used to pierce hides and sew garments, together with hide-cutting stone blade tools and hide-scrapers. (Some much older sites have tools that suggest human relatives could have worn clothes hundreds of thousands of years ago, but the evidence is far less certain.)

The onset of colder climate isn’t the only interesting development that corresponds with the creation of clothes in Africa. In the same period, personal ornaments appeared in the lives of Pleistocene humans. Contrebandiers Cave, for example, is littered with tiny shells that offered no nutritional benefit but may have been valued for other reasons.

“Some of them are pierced, and they show up all over Africa around this time,” Hallett explains. “Most archaeologists believe this is personal ornamentation, a form of symbolic expression, and it’s interesting that this evidence for clothing shows up at the same time in these mild habitats.”

The world’s oldest surviving clothing hasn’t lasted nearly as long as shells or beads. The world’s oldest known shoes, bark sandals, were stashed in a central Oregon cave some 9,000 or 10,000 years ago. Some of the oldest extant clothes were found on the famous mummy Ötzi, who lived some 5,000 years ago. By that same time, Egyptians were producing fine linens, as evidenced by the Tarkhan dress, the world’s oldest woven garment.

While scientists say it’s extremely unlikely that skins or fur could ever be found preserved from the far more ancient eras when humans first started wearing them, another line of indirect evidence seems to dovetail nicely with the archaeological findings at Contrebandiers. “Human lice have evolved in tandem with their hosts…


Samsung Is Building a $17 Billion Chip Factory Right On Tesla’s Doorstep

 

As the race for electric supremacy heats up, manufacturers across the globe are scrambling to secure resources and establish reliable supply chains. As one of the biggest players in the EV market, Tesla requires massive amounts of hardware that it can’t produce itself, most notably semiconductor chips. The auto manufacturing world has been crippled by the recent chip shortage, and Tesla has not escaped the drought unharmed. Upcoming models such as the highly anticipated Tesla Roadster and Cybertruck have been repeatedly delayed, but there seems to be a light at the end of the tunnel: according to sources, Samsung is planning a $17 billion chip factory a few miles from Tesla’s Gigafactory in Texas.

The Korean corporate giant is set to meet with authorities in Taylor, Texas, this week to discuss details of the massive project, which is expected to cost $17 billion. According to Teslarati, an anonymous source told reporters that “so far, (Samsung Electronics) has thoroughly reviewed four to five locations for the ‘Star Project.’ What I learned is that (Samsung) finally chose Taylor after taking into account investment incentives and geographic conditions.” The sheer size of the new plant is astonishing: at over 51 million square feet, it will be more than four times the size of the company’s existing factory in Austin, Texas.

 

“If you set up the new facility in a location away from the Austin plant, it could cost Samsung more for the supply of water and electricity and the establishment of other infrastructure. But the advantage is that it can operate the line in a more stable manner,” the insider said. Samsung is a key supplier to Tesla, and this move will clearly benefit both companies, with plans already in motion for a collaboration on a new version of Tesla’s FSD computer. The new plant will also benefit from tax breaks of over $300 million, and the groundbreaking ceremony has been pegged for the first quarter of 2022, with operations planned to begin in 2024.

 

 
