Study peeks into the inner lives of Trinidad guppies
It sounds like a plot for the next sequel to “Finding Nemo” – scientists stressing out tropical fish to better analyze their inner lives.
It’s not the next Pixar project, though. Scientists from the University of Exeter have been studying guppies, and found that individual fish seem to have unique and complex personalities.
The finding wasn’t actually the initial goal of the study. Instead, the researchers simply wanted to see how risk-averse or risk-prone individual guppies were.
“The idea of a simple spectrum is often put forward to explain the behavior of individuals in species such as the Trinidadian guppy,” Dr. Tom Houslay said. “But our research shows that the reality is much more complex.”
For example, the researchers tried putting guppies in an unfamiliar environment to see which would hide and which would not. But instead of splitting neatly into those two groups, the guppies showed a range of strategies for coping with the stress. Some hid, some looked for a way to escape, some explored their new environment – their reactions to the new tank were varied.
Moreover, these personality traits of individual fish appeared to be consistent over time and in different situations. A fish that seemed more nervous than its fellows would always behave more nervously than its fellows in new situations, the researchers found.
“So, while the behavior of all the guppies changed depending on the situation – for example, all becoming more cautious in more stressful situations – the relative differences between individuals remained intact,” Houslay said.
In addition to moving the fish to a new tank initially, the scientists continued the experiment using models of birds or predatory fish to introduce stressful situations to the fish.
The researchers hope to continue learning about the guppies’ inner lives.
“We want to know how personality relates to other facets of life, and to what extent this is driven by genetic – rather than environmental – influences,” Dr. Alistair Wilson said. “The goal is really gaining insight into evolutionary processes, how different behavioral strategies might persist as species evolve.”
The study will be published in the journal Functional Ecology.
Could sea energy help power the future?
Scientists are looking to expand the options when it comes to clean energy. With wind and solar power sources already being developed, Japanese researchers are looking at another possibility: sea energy.
Dr. Tsumoru Shintake, a professor at the Okinawa Institute of Science and Technology’s Graduate University, began looking into new ways to harness the ocean’s power in 2012.
He and the university’s Quantum Wave Microscopy Unit anchored turbines to the ocean floor with mooring cables in the Kuroshio current, a north-flowing ocean current off the coast of Japan. The turbines converted the current’s energy into electrical power.
The experiment, titled “Sea Horse,” was a success, and the team is now looking for industry sponsors who will help them move on to the next phase of the project.
“Particularly in Japan, if you go around the beach you’ll find many tetrapods,” Shintake said in a press release. “Surprisingly, 30 percent of the seashore in mainland Japan is covered with tetrapods and wave breakers.”
Tetrapods – concrete, pyramid-shaped structures that help soften the force of waves along the shore – and the beach walls dubbed wave breakers could be altered to serve a dual purpose, Shintake and his team said. “Intelligent” tetrapods and wave breakers with nearby turbines could both protect Japan’s shoreline and turn sea energy into electricity for nearby communities.
“Using just 1 percent of the seashore of mainland Japan can [generate] about 10 gigawatts, which is equivalent to 10 nuclear power plants,” Shintake said. “That’s huge.”
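Shintake’s comparison can be checked with a little back-of-envelope arithmetic. The sketch below assumes, since the article does not say, that one nuclear plant is roughly 1 gigawatt and that mainland Japan’s coastline is on the order of 19,000 km; both figures are assumptions, not numbers from the study.

```python
# Back-of-envelope check of Shintake's claim, using assumed values
# (not stated in the article) for plant size and coastline length.
claimed_power_gw = 10        # power from 1% of the seashore, per Shintake
plants_equivalent = 10       # nuclear plants cited as equivalent
gw_per_plant = claimed_power_gw / plants_equivalent  # -> 1.0 GW per plant

coastline_km = 19_000        # assumed mainland coastline length
usable_km = 0.01 * coastline_km                      # 1% of the seashore
power_density_mw_per_km = claimed_power_gw * 1000 / usable_km
print(f"{gw_per_plant:.1f} GW per plant, "
      f"~{power_density_mw_per_km:.0f} MW per km of shore")
```

Under those assumptions, the claim implies a few tens of megawatts per kilometer of equipped shoreline – ambitious, but not obviously out of line for dense arrays of small turbines.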
To test the theory, the team has already placed some turbines, part of a project dubbed the Wave Energy Converter, near tetrapods and coral reefs.
The turbines have been designed with Japan’s environment in mind. They’re built strong enough to withstand typhoons, and flexible so the currents and waves don’t destroy them, the scientists said. They’ve also been designed so that sea creatures can avoid their blades.
They plan to install half-scale models of the new turbines soon. If successful, the project could be copied in other places where strong ocean currents might provide clean sea energy.
Shintake and his team presented their project at the 3rd Asian Wave and Tidal Energy Conference.
If we want to stop rising temperatures, rapid changes must happen
Deep decarbonization – the large-scale reduction of carbon emissions – must happen quickly if we are to cap global warming at 2 degrees Celsius and keep temperatures from continuing to rise.
If there is any chance of achieving this, it will take a significant, collaborative effort whereby carbon emissions are reduced by 70 percent over the next 35 years, according to a new study published in the journal Science.
This will be no easy feat and depends on more than just scientific studies that prove the effects of climate change, as the researchers noted.
The problem with many climate change studies is their “one solution fits all” approach, looking for a single piece of the puzzle that will reduce carbon emissions worldwide without considering the current cultural, political, or economic climate.
Instead, researchers from the universities of Sussex, Manchester, and Oxford took all these factors into account and conducted a study to create a “socio-technical” framework that can be used as a model for tackling rapid decarbonization successfully.
“Our ‘big picture’ socio-technical framework shows how interactions between various social groups can increase the momentum of low-carbon transitions,” said Professor Frank Geels from the University of Manchester and the lead author of the study.
The study includes a wide-reaching policy forum with four major steps, called lessons, that need to be taken for changes to occur.
The first is a transformation of ‘socio-technical systems,’ as speeding up the decarbonization process is not just dependent on new technology, but it must be widely accepted socially and politically as well.
The second step is the alignment of multiple innovations and systems. If there is a major shift in renewable energy, there will also have to be technical advances made in energy storage and demand for those innovations.
“One of the great strengths of this study is the equal emphasis it accords to technological, social, business and policy innovation, in all of which governments, as well as the private sector, have a key role to play. Public policy has an enormous role to play at every step in the creation of these changed conditions,” said Paul Ekins, the Director of the University College London Institute for Sustainable Resources.
The third step involves making sure the right tools are in place to support and motivate businesses and people transitioning to renewable energy and low carbon lifestyles.
The fourth step requires the removal of existing systems as new technologies and innovations are adopted.
This research not only shows how vital collaboration among policymakers, scientists, businesses, and the public sector is, but also makes the case that such collaboration is the only way to successfully achieve rapid decarbonization.
Climate change responsible for extreme cold during the winter
The impact of climate change high up in the Arctic can be felt in other parts of the Northern Hemisphere, according to a new study. When the high-altitude winds that circle this region lose strength, cold polar air escapes. Researchers have linked a weakening polar vortex to extreme winter cold in Europe and Russia.
Marlene Kretschmer from the Potsdam Institute for Climate Impact Research is the lead author of the study.
“In winter, the freezing Arctic air is normally ‘locked’ by strong circumpolar winds several tens of kilometers high in the atmosphere, known as the stratospheric polar vortex, so that the cold air is confined near the pole,” said Kretschmer. “We found that there’s a shift towards more-persistent weak states of the polar vortex. This allows frigid air to break out of the Arctic and threaten Russia and Europe with cold extremes. In fact this can explain most of the observed cooling of Eurasian winters since 1990.”
While winters in parts of the northeastern United States and Eurasia have been abnormally cold despite climate change, the Arctic has been rapidly warming. Researchers have found that these occurrences are most likely linked.
As global warming melts sea ice north of Scandinavia and Russia, warmth from the ocean is released into the atmosphere. This warmer air can reach as high as 30 kilometers, disturbing the polar vortex. The high-altitude winds circling the Arctic lose strength, allowing cold polar air to spill toward the mid-latitudes.
“Our latest findings not only confirm the link between a weak polar vortex and severe winter weather, but also calculated how much of the observed cooling in regions like Russia and Scandinavia is linked to the weakening vortex. It turns out to be most,” said co-author Judah Cohen. “Several types of weather extremes are on the rise with climate change, and our study adds evidence that this can also include cold spells, which is an unpleasant surprise for these regions.”
Research efforts are underway to see how climate change in the Arctic may also impact other parts of the planet.
“It is very important to understand how global warming affects circulation patterns in the atmosphere,” said co-author Dim Coumou. “Jet Stream changes can lead to more abrupt and surprising disturbances to which society has to adapt. The uncertainties are quite large, but global warming provides a clear risk given its potential to disturb circulation patterns driving our weather – including potentially disastrous extremes.”
20 small earthquakes this week alone at Mount Rainier, Washington
The United States Geological Survey Cascades Volcano Observatory (CVO) reported an uptick in earthquake activity at Mount Rainier. Over 20 small quakes were monitored by the Pacific Northwest Seismic Network, a collaborative effort between the University of Washington and the University of Oregon that monitors earthquakes and volcanic activity in the Pacific Northwest.
The CVO was quick to point out that although Mount Rainier typically experiences only about two earthquakes per week large enough to be recorded by seismic stations, swarms like this are not uncommon.
Similar swarms of mini-quakes were recorded in 2009, 2015, and 2016. The earthquakes in this most recent swarm were shallow, reportedly at one to two kilometers deep, and only reached a maximum magnitude of 1.6.
Mount Rainier is the highest peak in the Cascade Range, and its last notable eruption (when magma was released) was 1,000 years ago.
Scientists believe the recent mini-quakes are the result of events happening in Mount Rainier’s hydrothermal system. The hydrothermal system runs beneath the volcano carrying hot, mineral-rich water.
According to the CVO, earthquakes occur when cracks in the hydrothermal system seal shut and pressure and heat build up until the cracks break open.
To monitor the Mount Rainier Hydrothermal system, seismologists and scientists have set up three monitoring stations at the Rainier summit fumaroles, the Paradise Warm Springs and Paradise Creek, and the Nisqually River.
Fumaroles act like vents, releasing hot volcanic gases, and scientists have tested the gas emissions and temperatures coming from the Rainier fumaroles for many years.
Temperature probes have been deployed at the Paradise Warm Springs and Nisqually River to continually monitor hydrothermal outflow coming from Mount Rainier.
These monitoring stations are part of a larger network that tracks the volcano’s activity and helps scientists give clear warnings when major activity is about to occur.
Cutting back on tropical seafood could help save coral reefs
Research from the University of British Columbia has found that reducing the amount of reef fish consumed by tourists in Palau is urgently needed to protect the health of coral reefs and maintain the ecological balance of the ocean. While the study is focused on Palau, a nation of several hundred islands in the western Pacific, the experts recommend that other small island nations adopt the same strategy.
Climate change is projected to cause sharp declines in Palau’s reefs. Researchers have determined that the best management strategy is to reduce tourist consumption of reef fish by over 70 percent.
“Palau’s reefs and the fish communities they host are incredibly beautiful and recognized worldwide as a top diving destination,” said lead author Colette Wabnitz. “Tourist numbers can reach nine times the local population and most come to enjoy the ocean. This puts enormous pressure on local marine resources that are central to local communities’ culture, food security and livelihoods.”
Palau relies heavily on tourism. Previous research has focused primarily on physical damage to coral reefs by tourists, but this is the first study to evaluate the impact of visitors’ consumption of reef fish.
Using a social-ecological computer model, the researchers set out to explore policy scenarios involving tourism, climate change, marine conservation, and local food security. The amount of fish being eaten stood out as a major contributor to future ecosystem declines.
Popular reef fish dishes include grouper, snapper, and parrotfish. The study showed that the health of reefs would benefit greatly from a shift away from this type of seafood toward open-water fish such as sustainably harvested tuna.
“Dining habits are removing important fish species from local reefs, and it’s ironic that viewing these fish is the reason people come in the first place. This is an important step that can be taken now, rather than a future adaptation to climate change,” said co-author Andrés Cisneros-Montemayor. “Sustainable tourism, especially ecotourism, shouldn’t threaten the food security of local people or their environment.”
New algorithm can predict extreme environmental events
A new method for predicting extreme events has been developed by researchers from the Massachusetts Institute of Technology (MIT). Extreme events refer to incidents like a previously stable species going extinct, a wave rising from calm waters, or an instability inside of a wind turbine.
These random events happen all over the world, but until now scientists have lacked the data and technology to predict them with any real accuracy.
Engineers from MIT have developed a framework that combines real-world data and already existing equations to predict extreme events.
The results of these calculations show that extreme events can in fact be predicted, and that the model is general enough to apply to many systems where events can occur.
“This happens in random places around the world, and the question is being able to predict where these vortices or hotspots of extreme events will occur. If you can predict where these things occur, maybe you can develop some control techniques to suppress them,” said Themistoklis Sapsis, an associate professor of mechanical and ocean engineering at MIT.
Previously, when attempting to predict extreme events, scientists relied on solving a series of complex mathematical equations that could indicate certain initial conditions were a precursor to an extreme event.
But these equations, as Sapsis noted, were not foolproof, and often resulted in inaccurate and unrealistic predictions.
Scientists who wanted to use real-world data to try and find some commonality or indicator that predicted the events also struggled because extreme events are random and rare.
Instead, Sapsis and fellow MIT collaborators created an algorithm that uses both the applicable equations and any known real-world data to create a more stable predictor of extreme events.
The researchers then simulated a turbine fluid flow and used the algorithm to identify the precursors to an extreme event. The method identified developing precursors 75 to 99 percent of the time.
Having an accurate model that predicts extreme events will be incredibly useful for future studies and for finding methods to prevent such events from occurring.
Exposure to air pollution could damage kidney function
Many studies have proven the link between air pollution and cardiovascular disease. As harmful pollutants enter the airways and bloodstream, they can cause artery blockages, a buildup of fatty deposits, heart attacks, and fatal heart failure. However, a new study discovered that air pollution affects not only cardiovascular health, but also kidney function.
The study, published in the Journal of the American Society of Nephrology (JASN), links air pollution levels to kidney failure, kidney disease, and kidney function decline. The findings even show that kidney health can be impaired at relatively low levels of air pollution.
“Even levels below the limit set by the EPA [Environmental Protection Agency] were harmful to the kidneys. This suggests that there is no safe level of air pollution,” said Dr. Ziyad Al-Aly, Director of Clinical Epidemiology at the VA Saint Louis Health Care System and leader of the study.
For the study, data were gathered on 2,482,737 US veterans using databases from the Environmental Protection Agency (EPA) and the Department of Veterans Affairs.
The veteran participants were followed for eight and a half years while the researchers measured kidney health and air pollution levels.
The researchers found a direct correlation between kidney decline and air pollution levels.
According to the new study, 44,793 new cases of chronic kidney disease each year can be attributed to air pollution levels exceeding the EPA’s standards.
This new research shows not only that there is a link between kidney disease and air pollution, but also that even levels below EPA recommendations can have a damaging effect.
Ancient DNA reveals how long fish have been living in our lakes
Researchers at the Department of Ecology and Environmental Science at Umeå University have recently found that fish DNA binds to lake sediment, forming a natural archive that can reveal when certain fish species colonized lakes after the glacial period.
For this study, the research team chose Lake Stora Lögdasjön and Lake Hotagen – which are in Sweden, in case the names didn’t tip you off. The reason behind these choices was that these lakes were known to have whitefish that colonized them at specific points in time. Stora Lögdasjön was connected to the Baltic Sea when the inland ice melted about 10,000 years ago. That connection was then cut off 9,200 years ago when the land lifted up and created a waterfall, which the whitefish were unable to travel up.
The researchers analyzed the prevalence of whitefish DNA in the sediment, finding that whitefish came to Stora Lögdasjön roughly 10,000 years ago, but they only arrived at Hotagen about 2,200 years ago.
“Our hypothesis was that the whitefish colonized Stora Lögdasjön immediately after the ice melt, which turned out to be accurate. At Hotagen, on the other hand, there was a waterfall that prevented the whitefish from colonizing the lake after the ice melted,” says Göran Englund, one of the researchers behind the study.
However, the DNA molecules that remain in lake sediment are sparse. New extraction methods had to be developed, and painstaking analyses were required to collect the data properly.
“Being able to map the prevalence of DNA in lake sediments is now opening up a new window into history, which lets us see how nature has developed over a long period of time,” says Göran Englund. “We have already started a project aiming to study how lake ecosystems are affected by historical climate changes. That can provide important clues to a better understanding of how the current global warming will affect ecosystems.”
The ability to look into the past using the technology of today is an important advancement that allows us to better understand the world around us. Studies such as this may initially seem trivial in their scope, but serve a greater purpose of furthering our knowledge of new research techniques that could benefit scientists around the world.
Hurricane Maria slams Puerto Rico, whole island without power
Hurricane Maria made landfall in Puerto Rico this morning as the strongest storm the island has seen in almost 90 years. The Category 4 hurricane roared onto the island around 6:15 a.m. with 155 mph winds, ripping roofs off of buildings and knocking out power across the entire territory.
By mid-morning, the hurricane had completely engulfed the 100-mile-long island. The National Weather Service in San Juan reported rainfall of up to 7 inches an hour. Residents reported windows shattering, buildings shaking, and streets turning into muddy rivers.
While anticipating the arrival of Hurricane Maria, Puerto Rico’s governor Ricardo Rossello told the Associated Press, “This is going to be an extremely violent phenomenon. We have not experienced an event of this magnitude in our modern history.”
Puerto Rico Emergency Management Agency (PREMA) Director Abner Gomez was braced for the worst at a press conference today. “When we can get outside, we will find our island destroyed,” he said. “The information we have received is not encouraging. It’s a system that has destroyed everything it has had in its path.”
On its way to Puerto Rico, Hurricane Maria ripped through the Caribbean after rapidly intensifying in strength from a Category 1 to a Category 5 storm in just 15 hours. The hurricane slammed into the tiny nation of Dominica Monday night with 160 mph winds, making it the strongest storm in the recorded history of the island.
As buildings are filling up with water and rivers are overflowing in Puerto Rico today, torrential rain and strong winds are expected to persist through this evening. Hurricane Maria will pass to the east of the Turks and Caicos as a Category 3 storm on Friday, and then will pass well to the east of the Bahamas by Saturday. Beyond this, the path of the storm is unclear.
“At this point, it looks like Maria will miss the United States and will move out to sea sometime later next week,” said ABC News meteorologist Max Golembo. “But this will be a close call, so we will be watching carefully.”
Governor Rossello took to Twitter today to encourage the residents of Puerto Rico. He tweeted, “God is with us; we are stronger than any hurricane. Together we will lift up.”
Earth’s sixth mass extinction could occur by 2100
Our Earth has gone through five mass extinction events in the last 540 million years. Each of these events came with an enormous shift in the normal cycling of carbon through the Earth’s atmosphere and oceans, unfolding over thousands to millions of years and resulting in the extermination of marine species across the planet.
Carbon emissions are a major part of what scientists believe is expediting climate change in our world today, leading many in the scientific community to wonder whether the carbon cycle may be on the verge of shifting again – with the consequence being a sixth mass extinction.
Determining whether or not our recent dramatic increase in carbon emissions – which have risen since the 19th century – could actually result in another mass extinction is no easy task. For scientists, it’s tough to relate the carbon anomalies of the ancient past, which occurred over thousands to millions of years, to the carbon increases today that have only been happening for the last century.
However, in a recent paper published in Science Advances, Daniel Rothman, a professor of geophysics at the Massachusetts Institute of Technology (MIT), analyzed the changes that occurred in the carbon cycle during the last five extinction events, allowing him to identify the “thresholds of catastrophe” in the Earth’s carbon cycle. He posits that if these thresholds are exceeded, it would lead to an unstable environment and trigger a mass extinction.
Rothman writes that mass extinction could occur if one of two thresholds is crossed. The first is for changes in the carbon cycle – occurring over long timescales – to happen at rates faster than the planet’s ecosystems can adapt. The second is for carbon perturbations that occur over shorter timescales (such as what we’re experiencing today), where the rate of change does not actually matter; rather, the magnitude of the change determines the likelihood of another mass extinction.
Projecting outward, Rothman predicts that a sixth extinction event would depend on whether a critical threshold of carbon added to the oceans is reached. He calculates this amount to be roughly 310 gigatons, and estimates that human activities will have added this amount to the Earth’s oceans by the year 2100.
“This is not saying that disaster occurs the next day,” Rothman clarifies. “It’s saying that, if left unchecked, the carbon cycle would move into a realm which would be no longer stable, and would behave in a way that would be difficult to predict. In the geologic past, this type of behavior is associated with mass extinction.”
The author says it could take about 10,000 years after this threshold is reached for any true extinction to occur. But the main issue is that by this time, we will have ventured into “unknown territory.”
Rothman determined these data points by deriving a mathematical formula based on physical principles relating the critical rate and magnitude of change in the carbon cycle to the timescale that divides fast and slow change. “It became evident that there was a characteristic rate of change that the system basically didn’t like to go past,” Rothman says.
Under the best-case scenario, humans will add 300 gigatons of carbon to our oceans by 2100. In the worst-case scenario, the number could be as high as 500 gigatons. If Rothman’s calculations are accurate, the carbon cycle will either be close to or over the threshold by 2100.
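The comparison between those scenario figures and Rothman’s threshold is simple enough to write down directly; the sketch below uses only the numbers quoted in the article (a ~310 gigaton threshold against projected additions of 300 to 500 gigatons by 2100).

```python
# Compare the article's projected carbon additions to the oceans by 2100
# against Rothman's estimated ~310 Gt "threshold of catastrophe".
THRESHOLD_GT = 310

def threshold_status(added_carbon_gt: float) -> str:
    """Classify a projected carbon addition against the 310 Gt threshold."""
    if added_carbon_gt >= THRESHOLD_GT:
        return "over threshold"
    return f"{THRESHOLD_GT - added_carbon_gt:.0f} Gt below threshold"

for scenario, gt in [("best case", 300), ("worst case", 500)]:
    print(f"{scenario}: {gt} Gt -> {threshold_status(gt)}")
```

Even the best-case scenario lands within 10 gigatons of the threshold, which is why Rothman describes the carbon cycle as either close to or past the critical point by 2100.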
“There should be ways of pulling back [carbon dioxide emissions],” Rothman says. “But this work points out reasons why we need to be careful, and it gives more reasons for studying the past to inform the present.”
Severe 7.1 earthquake strikes Mexico City, death toll rising
A severe 7.1-magnitude earthquake struck Mexico City on Tuesday, September 19, 2017, at 2:15 p.m. ET, killing over 100 people and destroying many large buildings. It came less than two weeks after an 8.1-magnitude earthquake off the western coast of Mexico on September 8 left nearly 100 dead.
As of midday Wednesday, the death toll had risen to over 225.
According to the Associated Press, the potent quake was felt across a vast and heavily populated area of central Mexico, causing power outages and collapsing buildings. In Mexico City alone, more than 44 buildings were reduced to rubble.
The epicenter of the earthquake has been estimated to be roughly 2.8 miles northeast of San Juan Raboso and 34 miles southwest of Puebla, according to the US Geological Survey. The quake struck at a depth of 33 miles.
The day also marked the anniversary of the devastating 1985 earthquake, which shook Mexico City with an 8.1 magnitude and killed thousands.
Just south of Mexico City, the Enrique Rebsamen school was partially flattened, killing at least 25 people, most of whom were children.