More data, more land reclamation success: Soil assessments pay off in faster regeneration, lower costs

More than 2.4 million miles of energy pipelines crisscross the United States. If assembled end-to-end, they would circle the Earth almost 100 times!
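
As a quick back-of-the-envelope check (assuming an equatorial circumference of about 24,900 miles), the arithmetic works out like this:

```python
# Rough scale check for the figures quoted above (values approximate).
pipeline_miles = 2_400_000           # total U.S. energy pipeline mileage
earth_circumference_miles = 24_901   # approximate equatorial circumference

laps = pipeline_miles / earth_circumference_miles
print(f"about {laps:.0f} trips around the Earth")   # ~96, i.e. "almost 100 times"
```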

Energy pipelines transport products such as crude oil or natural gas. Some of the pipelines are above ground, but most of them are buried. Often, energy pipelines pass through previously undisturbed areas. These areas need to be managed carefully to re-establish ecologically functioning systems. This complex process is called land reclamation.

A new study shows teams can increase the chance of successful land reclamation by first collecting soil data at short intervals. More collections can also lead to significant cost savings.

The study focused on a 170-mile length of natural gas pipeline in West Virginia. The land reclamation plan was to re-establish previously forested areas as a grassy, low-growth system.

Initially, all 170 miles had the same, standard reclamation plan based on more generic soil information. “But we know that soils can vary on a mile-to-mile basis,” says James Hartsig, a scientist at Duraroot, LLC in Keenesburg, CO.

The team of scientists gathered more soil data. They collected approximately 350 soil samples along the pipeline. Samples were collected every half-mile.

These samples were sent to accredited laboratories. There, technicians identified critical chemical and physical soil characteristics. “There were initially no sampling efforts involved in the project. We brought sampling efforts to the project for increased and accelerated vegetation. These soil fertility assessments helped us make more nuanced recommendations,” says Hartsig.

One area of focus was soil pH levels, a measure of soil acidity.

“Soil pH values are critical for several reasons,” says Hartsig. For example, soil pH helps determine nutrient availability for plants. Most plants prefer pH values between 5.5 and 7.

But getting a handle on soil pH values can be challenging, says Hartsig. Soils often have different pH values even within a mile. With data points every half-mile, the scientists could overcome these challenges. Throughout the site, pH values ranged from 4.5 to 8.5.

Typically, lime is added to soils to bring the pH within the desired range. Along the West Virginia pipeline, the standard recommendation was to add two tons of lime per acre of soil. But, based on the soil fertility assessments, Hartsig and colleagues had a more specific plan. They recommended no lime for almost half the pipeline. They also recommended three tons per acre for another quarter of the pipeline.
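
To see how half-mile sampling can turn into segment-by-segment recommendations, here is a minimal sketch of the idea. The pH cutoffs, milepost labels and rates are hypothetical illustrations, not the thresholds Hartsig's team actually used, and real recommendations also weigh factors such as buffer pH, soil texture and the target vegetation.

```python
def lime_rate_tons_per_acre(soil_ph: float) -> float:
    """Map a measured soil pH to a lime rate (hypothetical cutoffs, for illustration)."""
    if soil_ph >= 6.0:       # already near the preferred 5.5-7 range: no lime
        return 0.0
    if soil_ph >= 5.0:       # mildly acidic: the blanket two-ton specification
        return 2.0
    return 3.0               # strongly acidic segments get the heavier rate

# One reading per half-mile sample point along the alignment (made-up values)
samples = {"MP 12.0": 7.8, "MP 12.5": 5.4, "MP 13.0": 4.6}
for milepost, ph in samples.items():
    print(f"{milepost}: pH {ph} -> {lime_rate_tons_per_acre(ph)} tons lime/acre")
```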

The team also fine-tuned how much fertilizer to add.

“We found that applying the appropriate amount of lime and fertilizer leads to faster revegetation efforts,” says Hartsig. “Plus, there’s no need to go back to certain areas for re-application.”

Not having to reapply lime or fertilizer can be a big cost saver. That’s especially true in the mountainous areas of West Virginia where application processes can be very expensive.

These findings can improve land reclamation success in states other than West Virginia as well. “Regardless of geographical setting, these types of assessments can provide much better insight into soil properties,” says Hartsig. That’s key, especially for longer pipelines. In fact, “the longer the pipeline alignment, the better the chances that the standard specification will be refined.”

Hartsig and colleagues are continuing to research ways to enhance land reclamation. “We are currently testing different types of lime and fertilizers,” he says. “We are also exploring different application methods.” This could include drill seeding with fertilizer injections, broadcast seeding with fertilizer and lime, or aerially applying seed or fertilizers.

Ultimately, says Hartsig, the goal is to accelerate land reclamation success.

Hartsig presented these findings during the International Meeting of the Soil Science Society of America in San Diego, Jan. 4-7.

View the Original Article . . .


A warming world increases air pollution: Climate change is warming the ocean, but it’s warming land faster and that’s really bad news for air quality

Climate change is warming the ocean, but it’s warming land faster and that’s really bad news for air quality all over the world, says a new University of California, Riverside study.

The study, published February 4 in Nature Climate Change, shows that the contrast in warming between the continents and sea, called the land-sea warming contrast, drives an increased concentration of aerosols in the atmosphere that cause air pollution.

Aerosols are tiny solid particles or liquid droplets suspended in the atmosphere. They can come from natural sources, like dust or wildfires, or human-made sources such as vehicle and industrial emissions. Aerosols affect the climate system, including disturbances to the water cycle, as well as human health. They also cause smog and other kinds of air pollution that can lead to health problems for people, animals, and plants.

“A robust response to an increase in greenhouse gases is that the land is going to warm faster than the ocean. This enhanced land warming is also associated with increased continental aridity,” explained first author Robert Allen, an associate professor of earth sciences at UC Riverside.

The increase in aridity leads to decreased low cloud cover and less rain, which is the main way that aerosols are removed from the atmosphere.

To determine this, the researchers ran simulations of climate change under two scenarios. The first assumed a business-as-usual warming model, in which warming proceeds at a constant, upward rate. The second model probed a scenario in which the land warmed less than expected.

In the business-as-usual scenario, enhanced land warming increased continental aridity and, subsequently, the concentration of aerosols that leads to more air pollution. However, the second model — which is identical to the business-as-usual model except the land warming is weakened — leads to a muted increase in continental aridity and air pollution. Thus, the increase in air pollution is a direct consequence of enhanced land warming and continental drying.

The results show that the hotter Earth gets, the harder it’s going to be to keep air pollution down to a certain level without strict control over the sources of aerosols.

Because the researchers wanted to understand how greenhouse gas warming affects air pollution, they assumed no change to human-made, or anthropogenic, aerosol emissions.

“That’s probably not going to be true because there’s a strong desire to reduce air pollution, which involves reducing anthropogenic aerosol emissions,” cautioned Allen. “So this result represents an upper bound.”

But it also suggests that if the planet keeps warming, larger reductions in anthropogenic aerosol emissions will be required to improve air quality.

“The question is what level of air quality are we going to accept,” said Allen. “Even though California has some of the strictest environmental laws in the country we still have relatively poor air quality, and it’s much worse in many countries.”

Unless anthropogenic emission reductions occur, a warmer world will be associated with more aerosol pollution.

View the Original Article . . .


How Virtual Reality Will Transform Medicine

If you still think of virtual reality as the province of dystopian science fiction and geeky gamers, you had better think again. Faster than you can say “Ready Player One,” VR is starting to transform our world, and medicine may well be the first sector where the impact is profound. Behavioral neuroscientist Walter Greenleaf of Stanford University has been watching this field develop since the days when VR headsets cost $75,000 and were so heavy, he remembers counterbalancing them with a brick. Today some weigh about a pound and cost less than $200. Gaming and entertainment are driving current sales, but Greenleaf predicts that “the deepest and most significant market will be in clinical care and in improving health and wellness.”

Even in the early days, when the user entered a laughably low-resolution world, VR showed great promise. By the mid-1990s research had shown it could distract patients from painful medical procedures and ease anxiety disorders. One initial success was SnowWorld, which immersed burn patients in a cool, frozen landscape where they could lob snowballs at cartoon penguins and snowmen, temporarily blocking out the real world where nurses were scrubbing wounds, stretching scar tissue and gingerly changing dressings. A 2011 study with 54 children in burn units found an up to 44 percent reduction in pain during VR sessions—with the bonus that these injured kids said they had “fun.”

Another success came in the wake of 9/11. Psychologist JoAnn Difede of NewYork-Presbyterian/Weill Cornell Medical Center began using VR with World Trade Center survivors suffering from post-traumatic stress disorder (PTSD) and later with soldiers returning from Afghanistan and Iraq.

In Difede’s laboratory, I saw the original 9/11 VR program with its scenes of lower Manhattan and the newer Bravemind system, which depicts Iraqi and Afghan locales. Developed with Department of Defense funding by Albert “Skip” Rizzo and Arno Hartholt, both at the University of Southern California, Bravemind is used to treat PTSD at about 100 U.S. sites. The approach is based on exposure therapy, in which patients mentally revisit the source of their trauma guided by a therapist who helps them form a more coherent, less intrusive memory. In VR, patients do not merely reimagine the scene, they are immersed in it.

Difede showed me how therapists can customize scenes in Bravemind to match a patient’s experience. A keystroke can change the weather, add the sound of gunfire or the call to prayers. It can detonate a car bomb or ominously empty a marketplace. An optional menu of odors enables the patient to sniff gunpowder or spices through a metal tube. “What you do with exposure therapy is systematically go over the trauma,” Difede explains. “We’re teaching the brain to process and organize the memory so that it can be filed away and no longer intrudes constantly in the patient’s life.” The results, after nine to 12 gradually intensifying sessions, can be dramatic. One 2010 study with 20 patients found that 16 no longer met the criteria for PTSD after VR treatment.

Until recently, large-scale studies of VR have been missing in action. This is changing fast with the advent of cheaper, portable systems. Difede, Rizzo and three others just completed a randomized controlled trial with nearly 200 PTSD patients. Expected to be published this year, it may shed light on which patients do best with this high-tech therapy and which do not. In a study with her colleague, burn surgeon Abraham Houng, Difede is aiming to quantify the pain-distraction effects of a successor to SnowWorld called Bear Blast, a charming VR game in which patients toss balls at giggly cartoon bears. They will measure whether burn patients need lower doses of intravenous painkillers while playing.

Greenleaf counts at least 20 clinical arenas, ranging from surgical training to stroke rehabilitation to substance abuse, where VR is being applied. It can, for example, help recovering addicts avoid relapses by practicing “refusal skills”—turning down drinks at a virtual bar or heroin at a virtual party. Brain imaging suggests that such scenes can evoke very real cravings, just as Bravemind can evoke the heart-racing panic of a PTSD episode. Researchers foresee a day when VR will help make mental health care cheaper and more accessible, including in rural areas.

In a compelling 2017 paper that reviews 25 years of work, Rizzo and co-author Sebastian Koenig ask whether clinical VR is finally “ready for primetime.” If today’s larger studies bear out previous findings, the answer seems to be an obvious “yes.”

View the Original Article . . .

Claudia Wallis

Seas may be rising faster than thought: Current method of measuring sea-level rise may not be reliable

A new Tulane University study questions the reliability of how sea-level rise in low-lying coastal areas such as southern Louisiana is measured and suggests that the current method underestimates the severity of the problem.

Relative sea-level rise, which is a combination of rising water level and subsiding land, is traditionally measured using tide gauges. But researchers Molly Keogh and Torbjörn Törnqvist argue that in coastal Louisiana, tide gauges tell only a part of the story.

Tide gauges in such areas are anchored an average of 20 meters into the earth rather than at the ground surface. “As a result, tide gauges do not record subsidence occurring in the shallow subsurface and thus underestimate rates of relative sea-level rise,” said Keogh, a fifth-year PhD student and lead author of the study.

“This study shows that we need to completely rethink how we measure sea-level rise in rapidly subsiding coastal lowlands,” said Törnqvist, Vokes Geology Professor in the Tulane School of Science and Engineering.

The study, published in the open-access journal Ocean Science, says that while tide gauges can accurately measure subsidence that occurs below their foundations, they miss out on the shallow subsidence component. With at least 60 percent of subsidence occurring in the top 5 meters of the sediment column, tide gauges are not capturing the primary contributor to relative sea-level rise.

An alternative approach is to measure shallow subsidence using surface-elevation tables, inexpensive mechanical instruments that record surface elevation change in wetlands. Coastal Louisiana already has a network of more than 300 of these instruments in place. The data can then be combined with measurements of deep subsidence from GPS data and satellite measurements of sea-level rise, Keogh said.
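
In other words, the rise that the marsh surface actually experiences can be assembled from three independently measured pieces. Below is a minimal sketch of that bookkeeping with made-up rates; the numbers are purely illustrative and are not the study's data.

```python
# All rates in mm per year; the values are illustrative, not the study's data.
shallow_subsidence = 7.0   # surface-elevation table (RSET) record, top few meters
deep_subsidence    = 2.0   # GPS: sinking of the deep stratum a tide gauge is anchored in
sea_surface_rise   = 3.0   # satellite altimetry: rise of the water surface itself

relative_slr = shallow_subsidence + deep_subsidence + sea_surface_rise
print(f"Relative sea-level rise at the marsh surface: {relative_slr:.1f} mm/yr")

# A gauge anchored ~20 m down records only the deep and open-water components,
# so it misses the shallow term entirely:
tide_gauge_estimate = deep_subsidence + sea_surface_rise
print(f"Tide-gauge estimate: {tide_gauge_estimate:.1f} mm/yr (an underestimate)")
```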

Rates of relative sea-level rise obtained from this approach are substantially higher than rates inferred from tide-gauge data. “We therefore conclude that low-elevation coastal zones may be at higher risk of flooding, and within a shorter time horizon, than previously assumed,” Keogh said.

She said the research has implications for coastal communities across the globe.

“Around the world, communities in low-lying coastal areas may be more vulnerable to flooding than we realized. This has implications for coastal management, city planners and emergency planners. They are planning based on a certain timeline, and if sea level is rising faster than what they are planning on, that’s going to be a problem.”

The research was funded by the National Science Foundation.

Story Source:

Materials provided by Tulane University.

View the Original Article . . .


An overpriced smart home gizmo from six years ago suddenly gets interesting

It happens. Smart home gadgets come out faster than we can test them, and we end up late to the review. It’s a busy beat!

Never has this been more true than with LaMetric Time. First conceived nearly six years ago, the customizable, pixelated desktop clock that can display smart home alerts and social notifications went on to secure funding on Kickstarter, and is now available at a much-too-steep $199. It always struck me as a nifty-looking novelty that cost way too much, so we never got around to testing it out.

Things changed at CES 2019, though, when LaMetric resurfaced with a new product: pixelated LaMetric Sky LED wall panels that can either serve as an elegant, abstract art piece or, if you arrange enough of them into a rectangle, as a wall-mounted version of the Time’s notification center. They looked great in person in Las Vegas, and the pixelated approach lets them do things that the current top name in LED wall panels — Nanoleaf — can’t. 

Pricing on those panels isn’t set yet, but LaMetric’s team tells me that they’re aiming to compete with what’s already available, which would peg them at somewhere around $200 for a starter kit.

All of that suddenly makes the Time interesting — if only to get a sense of the features that might translate well to your walls, as well as the potential hiccups that could derail the idea altogether. Six years in and still 200 bucks, the Time costs at least twice as much as I’d be willing to spend on it, but with terrific-looking wall panel versions coming at the end of the year, this amped-up desk clock merits a closer look.



Definition of a desktop doodad


The LaMetric Time is an unassuming-looking black bar of plastic with a pixelated LED display on its front face and a trio of buttons up top. Plug it in, and it’ll start broadcasting a Wi-Fi signal. Download the LaMetric app, connect with that signal and sync it up with your home network, and you’ll be able to customize and control the display right from your phone.

Your options are pretty vast, with a long list of user-created, Time-specific “apps” that you can download from within LaMetric’s smart phone app. Each one enables your Time to show you something new. Usual suspects like Twitter mentions, Facebook likes and Slack notifications are all accounted for, along with apps for weather updates and breaking news headlines and others that will display a countdown timer or a stopwatch.

There’s a pretty sizable library of free apps that you can download for your Time’s display.



Other apps let the Time move beyond passive display in order to take a more active role. For instance, smart home apps for brands like Philips Hue, Lifx and Nest let you see the status of your gadgets at a glance, then turn things on or off with the press of a button. A built-in internet radio app lets you pick your favorite streaming stations and play them through the speaker built into the device (I programmed mine to stream Live Phish Radio with just a tap). And yep, this thing works pretty well as an alarm clock, too.

You can cycle between all of those app functions by pressing the left and right buttons on the top of the device, or you can set the LaMetric Time to cycle through your apps automatically. I found the latter approach to be a bit distracting on my desk at work as I tested this thing out, so I ended up leaving it locked on the main clock display, with notifications for things like Twitter mentions or new emails from the boss only showing up as they occurred.

That main, default display also lets you pick a custom icon from an extensive library of user-created options. If none of those work, you can create your own, complete with surprisingly simple controls for animating it.

All of that makes it really easy to personalize the LaMetric Time to your specific interests — but if you really want to unlock its full potential, you’ll need an assist from IFTTT.

View the Original Article . . .

Ry Crist

A faster, more efficient cryptocurrency

MIT researchers have developed a new cryptocurrency that drastically reduces the data users need to join the network and verify transactions — by up to 99 percent compared to today’s popular cryptocurrencies. This means a much more scalable network.

Cryptocurrencies, such as the popular Bitcoin, are networks built on the blockchain, a financial ledger formatted in a sequence of individual blocks, each containing transaction data. These networks are decentralized, meaning there are no banks or organizations to manage funds and balances, so users join forces to store and verify the transactions.

But decentralization leads to a scalability problem. To join a cryptocurrency, new users must download and store all transaction data from hundreds of thousands of individual blocks. They must also store these data to use the service and help verify transactions. This makes the process slow or computationally impractical for some.

In a paper being presented at the Network and Distributed System Security Symposium next month, the MIT researchers introduce Vault, a cryptocurrency that lets users join the network by downloading only a fraction of the total transaction data. It also incorporates techniques that delete empty accounts that take up space, and enables verifications using only the most recent transaction data that are divided and shared across the network, minimizing an individual user’s data storage and processing requirements.

In experiments, Vault reduced the bandwidth for joining its network by 99 percent compared to Bitcoin and 90 percent compared to Ethereum, which is considered one of today’s most efficient cryptocurrencies. Importantly, Vault still ensures that all nodes validate all transactions, providing tight security equal to its existing counterparts.

“Currently there are a lot of cryptocurrencies, but they’re hitting bottlenecks related to joining the system as a new user and to storage. The broad goal here is to enable cryptocurrencies to scale well for more and more users,” says co-author Derek Leung, a graduate student in the Computer Science and Artificial Intelligence Laboratory (CSAIL).

Joining Leung on the paper are CSAIL researchers Yossi Gilad and Nickolai Zeldovich, who is also a professor in the Department of Electrical Engineering and Computer Science (EECS); and recent alumnus Adam Suhl ’18.

Vaulting over blocks

Each block in a cryptocurrency network contains a timestamp, its location in the blockchain, and a fixed-length string of numbers and letters, called a “hash,” that’s basically the block’s identification. Each new block contains the hash of the previous block in the blockchain. Blocks in Vault also contain up to 10,000 transactions — or 10 megabytes of data — that must all be verified by users. The structure of the blockchain and, in particular, the chain of hashes, ensures that an adversary cannot hack the blocks without detection.
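
The hash chain itself is easy to illustrate. Here is a minimal, generic sketch (not Vault’s actual block format): because each block commits to the hash of the block before it, altering any historical block breaks every link that follows.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Serialize the block deterministically, then hash it.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

chain = []
prev = "0" * 64                                   # placeholder hash for the genesis block
for height, txs in enumerate([["a->b:5"], ["b->c:2"], ["c->a:1"]]):
    block = {"height": height, "prev_hash": prev, "transactions": txs}
    chain.append(block)
    prev = block_hash(block)

# Tampering with an old block invalidates the link stored in its successor.
chain[0]["transactions"] = ["a->b:500"]
print(block_hash(chain[0]) == chain[1]["prev_hash"])   # False: the edit is detectable
```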

New users join cryptocurrency networks, or “bootstrap,” by downloading all past transaction data to ensure they’re secure and up to date. To join Bitcoin last year, for instance, a user would download 500,000 blocks totaling about 150 gigabytes. Users must also store all account balances to help verify new users and ensure users have enough funds to complete transactions. Storage requirements are becoming substantial, as Bitcoin expands beyond 22 million accounts.

The researchers built their system on top of a new cryptocurrency network called Algorand — invented by Silvio Micali, the Ford Professor of Engineering at MIT — that’s secure, decentralized, and more scalable than other cryptocurrencies.

With traditional cryptocurrencies, users compete to solve equations that validate blocks, with the first to solve the equations receiving funds. As the network scales, this slows down transaction processing times. Algorand uses a “proof-of-stake” concept to more efficiently verify blocks and better enable new users to join. For every block, a representative verification “committee” is selected. Users with more money — or stake — in the network have a higher probability of being selected. To join the network, users verify each certificate, not every transaction.

But each block holds some key information to validate the certificate immediately ahead of it, meaning new users must start with the first block in the chain, along with its certificate, and sequentially validate each one in order, which can be time-consuming. To speed things up, the researchers give each new certificate verification information based on a block a few hundred or 1,000 blocks behind it — called a “breadcrumb.” When a new user joins, they match the breadcrumb of an early block to a breadcrumb 1,000 blocks ahead. That breadcrumb can be matched to another breadcrumb 1,000 blocks ahead, and so on.

“The paper title is a pun,” Leung says. “A vault is a place where you can store money, but the blockchain also lets you ‘vault’ over blocks when joining a network. When I’m bootstrapping, I only need a block from way in the past to verify a block way in the future. I can skip over all blocks in between, which saves us a lot of bandwidth.”
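
A toy version of that vaulting idea is sketched below. It assumes, purely for illustration, that each block simply stores a hash of the block one stride behind it; the real Vault design attaches this information to Algorand certificates and uses cryptographic committee selection, which is not modeled here.

```python
import hashlib
import json

STRIDE = 1000   # illustrative; the paper describes "a few hundred or 1,000 blocks"

def digest(obj) -> str:
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()

# Build a toy chain in which every block carries a "breadcrumb": a commitment
# to the block STRIDE positions behind it.
blocks = []
for height in range(5001):
    breadcrumb = digest(blocks[height - STRIDE]) if height >= STRIDE else None
    blocks.append({"height": height, "breadcrumb": breadcrumb})

def bootstrap(blocks, stride=STRIDE):
    """Check the recent chain by hopping stride blocks at a time, skipping the rest."""
    pos = 0
    while pos + stride < len(blocks):
        hop = blocks[pos + stride]
        assert hop["breadcrumb"] == digest(blocks[pos]), "breadcrumb mismatch"
        pos += stride
    return pos

print(bootstrap(blocks))   # 5000: reached the tip while touching only a handful of blocks
```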

Divide and discard

To reduce data storage requirements, the researchers designed Vault with a novel “sharding” scheme. The technique divides transaction data into smaller portions — or shards — that it shares across the network, so individual users only have to process small amounts of data to verify transactions.

To implement sharding in a secure way, Vault uses a well-known data structure called a binary Merkle tree. In binary trees, a single top node branches off into two “children” nodes, and those two nodes each break into two children nodes, and so on.

In Merkle trees, the top node contains a single hash, called a root hash. But the tree is constructed from the bottom up. The tree combines each pair of children hashes along the bottom to form their parent hash. It repeats that process up the tree, assigning a parent node from each pair of children nodes, until it combines everything into the root hash. In cryptocurrencies, the top node contains a hash of a single block. Each bottom node contains a hash that signifies the balance information about one account involved in one transaction in the block. The balance hash and block hash are tied together.

To verify any one transaction, the network combines the two children nodes to get the parent node hash. It repeats that process working up the tree. If the final combined hash matches the root hash of the block, the transaction can be verified. But with traditional cryptocurrencies, users must store the entire tree structure.
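
That bottom-up construction and proof check can be sketched in a few lines. This is a generic binary Merkle tree, not Vault’s exact layout, with placeholder account strings as the leaves.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Combine pairs of hashes level by level until one root hash remains."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                      # duplicate the last node if the level is odd
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Collect the sibling hashes needed to rebuild the root from one leaf."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index + 1 if index % 2 == 0 else index - 1
        proof.append((level[sibling], index % 2 == 0))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf, proof, root):
    node = h(leaf)
    for sibling, leaf_on_left in proof:
        node = h(node + sibling) if leaf_on_left else h(sibling + node)
    return node == root

accounts = [b"alice:37", b"bob:12", b"carol:5", b"dave:0"]   # toy leaf data
root = merkle_root(accounts)
print(verify(b"bob:12", merkle_proof(accounts, 1), root))   # True
print(verify(b"bob:99", merkle_proof(accounts, 1), root))   # False: balance was altered
```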

With Vault, the researchers divide the Merkle tree into separate shards assigned to separate groups of users. Each user only ever stores the balances of the accounts in their assigned shard, as well as root hashes. The trick is having all users store one layer of nodes that cuts across the entire Merkle tree. When a user needs to verify a transaction from outside of their shard, they trace a path to that common layer. From that common layer, they can determine the balance of the account outside their shard, and continue validation normally.

“Each shard of the network is responsible for storing a smaller slice of a big data structure, but this small slice allows users to verify transactions from all other parts of network,” Leung says.

Additionally, the researchers designed a novel scheme that recognizes and discards from a user’s assigned shard accounts that have had zero balances for a certain length of time. Other cryptocurrencies keep all empty accounts, which increase data storage requirements while serving no real purpose, as they don’t need verification. When users store account data in Vault, they ignore those old, empty accounts.


View the Original Article . . .


Greenland ice melting four times faster than in 2003: Southwest part of the island could be major contributor to sea level rise

Greenland is melting faster than scientists previously thought — and will likely lead to faster sea level rise — thanks to the continued, accelerating warming of the Earth’s atmosphere, a new study has found.

Scientists concerned about sea level rise have long focused on Greenland’s southeast and northwest regions, where large glaciers stream iceberg-sized chunks of ice into the Atlantic Ocean. Those chunks float away, eventually melting. But a new study published Jan. 21 in the Proceedings of the National Academy of Sciences found that the largest sustained ice loss from early 2003 to mid-2013 came from Greenland’s southwest region, which is mostly devoid of large glaciers.

“Whatever this was, it couldn’t be explained by glaciers, because there aren’t many there,” said Michael Bevis, lead author of the paper, Ohio Eminent Scholar and a professor of geodynamics at The Ohio State University. “It had to be the surface mass — the ice was melting inland from the coastline.”

That melting, which Bevis and his co-authors believe is largely caused by global warming, means that in the southwestern part of Greenland, growing rivers of water are streaming into the ocean during summer. The key finding from their study: Southwest Greenland, which previously had not been considered a serious threat, will likely become a major future contributor to sea level rise.

“We knew we had one big problem with increasing rates of ice discharge by some large outlet glaciers,” he said. “But now we recognize a second serious problem: Increasingly, large amounts of ice mass are going to leave as meltwater, as rivers that flow into the sea.”

The findings could have serious implications for coastal U.S. cities, including New York and Miami, as well as island nations that are particularly vulnerable to rising sea levels.

And there is no turning back, Bevis said.

“The only thing we can do is adapt and mitigate further global warming — it’s too late for there to be no effect,” he said. “This is going to cause additional sea level rise. We are watching the ice sheet hit a tipping point.”

Climate scientists and glaciologists have been monitoring the Greenland ice sheet as a whole since 2002, when NASA and Germany joined forces to launch GRACE. GRACE stands for Gravity Recovery and Climate Experiment, and involves twin satellites that measure ice loss across Greenland. Data from these satellites showed that between 2002 and 2016, Greenland lost approximately 280 gigatons of ice per year, equivalent to 0.03 inches of sea level rise each year. But the rate of ice loss across the island was far from steady.
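
The conversion from mass loss to sea level is simple arithmetic: spread roughly 280 gigatons of meltwater (about 280 cubic kilometers) over the global ocean surface of roughly 361 million square kilometers.

```python
ice_loss_gt_per_year = 280     # GRACE estimate, average over 2002-2016
ocean_area_km2 = 3.61e8        # approximate global ocean surface area

# 1 gigaton of water occupies about 1 cubic kilometer, so:
rise_km = ice_loss_gt_per_year / ocean_area_km2
rise_mm = rise_km * 1e6        # kilometers -> millimeters
rise_in = rise_mm / 25.4
print(f"{rise_mm:.2f} mm/yr, or about {rise_in:.2f} in/yr")   # ~0.78 mm, ~0.03 in
```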

Bevis’ team used data from GRACE and from GPS stations scattered around Greenland’s coast to identify changes in ice mass. The patterns they found show an alarming trend — by 2012, ice was being lost at nearly four times the rate that prevailed in 2003. The biggest surprise: This acceleration was focused in southwest Greenland, a part of the island that previously hadn’t been known to be losing ice that rapidly.

Bevis said a natural weather phenomenon — the North Atlantic Oscillation, which brings warmer air to West Greenland, as well as clearer skies and more solar radiation — was building on man-made climate change to cause unprecedented levels of melting and runoff. Global atmospheric warming enhances summertime melting, especially in the southwest. The North Atlantic Oscillation is a natural — if erratic — cycle that causes ice to melt under normal circumstances. When combined with man-made global warming, though, the effects are supercharged.

“These oscillations have been happening forever,” Bevis said. “So why only now are they causing this massive melt? It’s because the atmosphere is, at its baseline, warmer. The transient warming driven by the North Atlantic Oscillation was riding on top of more sustained, global warming.”

Bevis likened the melting of Greenland’s ice to coral bleaching: Once the ocean’s water hits a certain temperature, coral in that region begins to bleach. There have been three global coral bleaching events. The first was caused by the 1997-98 El Niño, and the other two events by the two subsequent El Niños. But El Niño cycles have been happening for thousands of years — so why have they caused global coral bleaching only since 1997?

“What’s happening is sea surface temperature in the tropics is going up; shallow water gets warmer and the air gets warmer,” Bevis said. “The water temperature fluctuations driven by an El Niño are riding this global ocean warming. Because of climate change, the base temperature is already close to the critical temperature at which coral bleaches, so an El Niño pushes the temperature over the critical threshold value. And in the case of Greenland, global warming has brought summertime temperatures in a significant portion of Greenland close to the melting point, and the North Atlantic Oscillation has provided the extra push that caused large areas of ice to melt.”

Before this study, scientists understood Greenland to be one of the Earth’s major contributors to sea-level rise — mostly because of its glaciers. But these new findings, Bevis said, show that scientists need to be watching the island’s snowpack and ice fields more closely, especially in and near southwest Greenland.

GPS systems in place now monitor Greenland’s ice sheet margin around most of its perimeter, but the network is very sparse in the southwest, so it is necessary to densify the network there, given these new findings.

“We’re going to see faster and faster sea level rise for the foreseeable future,” Bevis said. “Once you hit that tipping point, the only question is: How severe does it get?”

View the Original Article . . .


Oceans are warming even faster than previously thought: Recent observations show ocean heating in line with climate change models

Berkeley — Heat trapped by greenhouse gases is raising ocean temperatures faster than previously thought, concludes an analysis of four recent ocean heating observations. The results provide further evidence that earlier claims of a slowdown or “hiatus” in global warming over the past 15 years were unfounded.

“If you want to see where global warming is happening, look in our oceans,” said Zeke Hausfather, a graduate student in the Energy and Resources Group at the University of California, Berkeley, and co-author of the paper. “Ocean heating is a very important indicator of climate change, and we have robust evidence that it is warming more rapidly than we thought.”

Ocean heating is a critical marker of climate change because an estimated 93 percent of the excess solar energy trapped by greenhouse gases accumulates in the world’s oceans. And, unlike surface temperatures, ocean temperatures are not affected by year-to-year variations caused by climate events like El Niño or volcanic eruptions.

The new analysis, published Jan. 11 in Science, shows that trends in ocean heat content match those predicted by leading climate change models, and that overall ocean warming is accelerating.

Assuming a “business-as-usual” scenario in which no effort has been made to reduce greenhouse gas emissions, the Coupled Model Intercomparison Project 5 (CMIP5) models predict that the temperature of the top 2,000 meters of the world’s oceans will rise 0.78 degrees Celsius by the end of the century. The thermal expansion caused by this bump in temperature would raise sea levels 30 centimeters, or around 12 inches, on top of the already significant sea level rise caused by melting glaciers and ice sheets. Warmer oceans also contribute to stronger storms, hurricanes and extreme precipitation.
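
As a rough consistency check, not the models’ actual computation: warming a 2,000-meter column by 0.78 degrees Celsius, with a typical thermal expansion coefficient for seawater of about 2 x 10^-4 per degree (an assumed, ballpark value), expands that column by roughly 0.3 meters.

```python
layer_depth_m = 2000      # depth of the warming layer considered by the models
delta_t_c = 0.78          # projected end-of-century warming of that layer, in deg C
alpha_per_c = 2.0e-4      # assumed ballpark thermal expansion coefficient of seawater

expansion_m = layer_depth_m * delta_t_c * alpha_per_c
print(f"roughly {expansion_m * 100:.0f} cm of thermal expansion")   # ~31 cm vs. the cited 30 cm
```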

“While 2018 will be the fourth warmest year on record on the surface, it will most certainly be the warmest year on record in the oceans, as was 2017 and 2016 before that,” Hausfather said. “The global warming signal is a lot easier to detect if it is changing in the oceans than on the surface.”

The four studies, published between 2014 and 2017, provide better estimates of past trends in ocean heat content by correcting for discrepancies between different types of ocean temperature measurements and by better accounting for gaps in measurements over time or location.

“The Intergovernmental Panel on Climate Change’s (IPCC) Fifth Assessment Report, published in 2013, showed that leading climate change models seemed to predict a much faster increase in ocean heat content over the last 30 years than was seen in observations,” Hausfather said. “That was a problem, because of all things, that is one thing we really hope the models will get right.”

“The fact that these corrected records now do agree with climate models is encouraging in that it removes an area of big uncertainty that we previously had,” he said.

Deep Divers

A fleet of nearly 4,000 floating robots drift throughout the world’s oceans, every few days diving to a depth of 2,000 meters and measuring the ocean’s temperature, pH, salinity and other bits of information as they rise back up. This ocean-monitoring battalion, called Argo, has provided consistent and widespread data on ocean heat content since the mid-2000s.

Prior to Argo, ocean temperature data was sparse at best, relying on devices called expendable bathythermographs that sank to the depths only once, transmitting data on ocean temperature until settling into watery graves.

Three of the new studies included in the Science analysis calculated ocean heat content back to 1970 and before using new methods to correct for calibration errors and biases in both the Argo and bathythermograph data. The fourth takes a completely different approach, using the fact that a warming ocean releases oxygen to the atmosphere to calculate ocean warming from changes in atmospheric oxygen concentrations, while accounting for other factors, like burning fossil fuels, that also change atmospheric oxygen levels.

“Scientists are continually working to improve how to interpret and analyze what was a fairly imperfect and limited set of data prior to the early 2000s,” Hausfather said. “These four new records that have been published in recent years seem to fix a lot of problems that were plaguing the old records, and now they seem to agree quite well with what the climate models have produced.”

View the Original Article . . .


Furrion’s electric cooler keeps drinks cold for up to a week


Nothing ruins a tailgate faster than lukewarm beers, but with Furrion’s Rova electric cooler, that won’t be an issue — even if your party rages for a week.

The Rova can hold more than two dozen cans in its 1.3 cubic liter chiller compartment and keep them cold for up to seven days without the need for additional ice. What’s more, if you fill the compartment entirely with ice, the Rova will prevent it from melting for up to a fortnight, though its endurance depends on whether you set the cooler to beverage, freezer or eco mode.


The system is powered by a 400 watt lithium power pack but can also carry a second one in reserve. If you manage to run through both cells while on the road, they can easily (albeit slowly) be recharged either with a standard solar panel or through your vehicle’s cigarette lighter. The Rova also includes a Qi charging pad and three USB ports to keep the rest of your electronics topped off throughout the party. An integrated LCD display enables you to keep an eye on its status and remaining power levels. And don’t worry about mixing electricity with inclement weather. The Rova can stand up to rain, sleet and snow better than your local mail carrier.

The Rova will be available for sale this June for $800. An array of accessories — from external drink holders to an integrated umbrella and even a Bluetooth speaker system — will also be available but sold separately.

Follow all the latest news from CES 2019 here!

View the Original Article . . .


Why is sea level rising faster in some places along the US East Coast than others?

Sea levels are rising globally from ocean warming and melting of land ice, but the seas aren’t rising at the same rate everywhere. Sea levels have risen significantly faster in some U.S. East Coast regions compared to others. A new study led by the Woods Hole Oceanographic Institution (WHOI) reveals why.

Over the 20th century, sea level has risen about a foot and a half in coastal communities near Cape Hatteras in North Carolina and along the Chesapeake Bay in Virginia. In contrast, New York City and Miami have experienced about a 1-foot rise over the same period, while sea levels farther north in Portland, Maine, rose only about half a foot.

The reason is a phenomenon called “post-glacial rebound,” explains Chris Piecuch, lead author of a study published on Dec. 20, 2018, in the journal Nature. Essentially, land areas in the Northern Hemisphere that once were covered by mammoth ice sheets during the last Ice Age — such as Canada and parts of the Northeast U.S. — were weighed down like a trampoline with a boulder on it. At the same time, land around the periphery of the ice sheets — along the U.S. mid-Atlantic coast, for example — rose up. As the ice sheets melted from their peak at the Last Glacial Maximum 26,500 years ago, the weighed-down areas gradually rebounded, while the peripheral lands started sinking, creating sort of a see-saw effect. Even though the ice sheets had disappeared by 7,000 years ago, the see-sawing of post-glacial rebound continues to this day.

To explore why sea levels rose faster during the last century in areas such as Norfolk Naval Station in Virginia and the Outer Banks in North Carolina, Piecuch and colleagues gathered tidal gauge measurements of sea levels, GPS satellite data that show how much the land has moved up and down over time, and fossils in sediment from salt marshes, which record past coastal sea levels. They combined all of this observational data with complex geophysical models — something that has not been done before — to give a more complete view of sea level changes since 1900.

The research team found that post-glacial rebound accounted for most of the variation in sea level rise along the East Coast. But, importantly, when that factor was stripped away, the researchers found that “sea level trends increased steadily from Maine all the way down to Florida,” Piecuch said.

“The cause for that could involve more recent melting of glaciers and ice sheets, groundwater extraction and damming over the last century,” Piecuch says. “Those effects move ice and water mass around at Earth’s surface, and can impact the planet’s crust, gravity field and sea level.”

“Post-glacial rebound is definitely the most important process causing spatial differences in sea level rise on the U.S. East Coast over the last century. And since that process plays out over millennia, we’re confident projecting its influence centuries into the future,” Piecuch explains. “But regarding the mass redistribution piece of the puzzle, we’re less certain how that’s going to evolve into the future, which makes it much more difficult to predict sea level rise and its impact on coastal communities.”

Story Source:

Materials provided by Woods Hole Oceanographic Institution.

View the Original Article . . .


Mobile internet is faster than WiFi in 33 countries


It’s tempting to assume that a good WiFi hotspot will outpace modern cellular data, but that’s not necessarily true — in some countries, WiFi might be more of a pain. OpenSignal has conducted a study showing that mobile data is faster on average than WiFi hotspots in 33 countries, including multiple African, European, Latin American and Middle Eastern nations. And the differences are sometimes gigantic. You’ll typically have an advantage of 10Mbps or more in places like Australia, Oman and the Czech Republic, while multi-megabit advantages are common in places like Austria, Iran and South Africa.

There are many countries where cellular and WiFi links are roughly competitive. And not surprisingly, WiFi has a clear advantage in countries where home broadband is relatively fast, such as Hong Kong, Singapore, South Korea and the US. However, LTE provides a consistent edge for download speeds in some areas — in Lebanon, your downstream speeds tend to be 25Mbps faster than on WiFi.

The findings led OpenSignal to suggest that users and device makers alike need to rethink the assumption that WiFi is usually best. While that might have been true when smartphones were young, it’s not so true any more in the LTE era — and WiFi has its own problems, such as overcrowded networks. This is before 5G promises gigabit-class speeds for some users, too. While WiFi has its purposes (both for local networking and for places with data caps), you might find yourself sticking to cellular access more often going forward.


View the Original Article . . .


NTT DoCoMo and Mitsubishi hit fastest 5G in-car speeds yet


With the first 5G phones expected in 2019, carriers in the US and abroad are busy laying the groundwork for faster data speeds. Japan’s three biggest mobile networks are aiming for a similar timeframe, a year ahead of their original 2020 schedule, to coincide with the Rugby World Cup. One of the country’s leading carriers, NTT DoCoMo, has now announced a 5G milestone with the help of Mitsubishi Electric. Together they hit 27Gbps during outdoor trials in Japan’s Kanagawa Prefecture.

According to NTT DoCoMo, it marked the world’s first 5G transmission to exceed a peak speed of 20Gbps using one terminal, with a communication distance of 10 metres attaining the 27Gbps speed, and a speed of 25Gbps over 100 metres.

The trials utilized 16-beam spatial multiplexing in line-of-sight conditions, where massive-element base-station antennas on the wall of a building channelled beams to mobile-terminal antennas installed on the rooftop of a Mitsubishi vehicle, according to Mobile World Live.

They claim the same method could be used to transmit high-speed connectivity to vehicles with multiple passengers like trains and buses. The two companies said they achieved the high 5G speeds by building more advanced beam-forming technology.

But NTT DoCoMo isn’t alone in chasing 5G in-car speeds. Last year, mobile carrier Softbank installed 5G base stations at Honda’s test course in Hokkaido — the northernmost prefecture in Japan — to test smart car connectivity. It also began trialling 5G as far back as 2016 in conjunction with Ericsson, while NTT DoCoMo has partnered with Huawei on its own tests, and fellow carrier KDDI tapped Samsung for its 5G field trials earlier this year.

View the Original Article . . .


Apple adds faster AMD Vega graphics options for 15-inch MacBook Pro


Apple has acted on its promise to give the 2018 MacBook Pro a much-appreciated graphics performance boost. You can now configure the higher-end 15-inch laptop with Radeon Pro Vega 16 or 20 GPUs that, if you ask Apple, deliver up to 60 percent faster processing power for tasks like 3D modeling and GPU-accelerated video edits. Both options come with 4GB of memory, so your choice boils down to the level of computational power you want.

Get ready to pay a premium if you do want either video chip. In addition to having to buy a higher-end MacBook Pro, you’ll pay $250 more for the Vega 16 and $350 more for the Vega 20. That raises the minimum price for a Vega-equipped Pro to $3,049 — this is really for creatives, not enthusiasts hoping to squeeze higher frame rates out of Fortnite. Nonetheless, this is very welcome if you thought the Radeon Pro 500-series didn’t cut the mustard for a portable workstation.

View the Original Article . . .


Google’s new ‘Squoosh’ app is designed to optimize your images


Aside from clamping down on deceptive websites, Google is also looking to make the web faster by taking the fight to cumbersome images. Cue a collective cheer from netizens everywhere. To that end, Google Chrome Labs has designed a new web tool called Squoosh that lets devs compress and reformat pics. The app taps WebAssembly to quickly squash down images using a bunch of codecs and is available on all browsers, though (unsurprisingly) it works best on Chrome.

You can access Squoosh online through its lightweight website and, once loaded, it can also work offline within the browser. The web-based app is relatively straightforward to use: it supports a range of web formats such as JPG, MozJPEG, WebP and PNG. It also shows you a 1:1 visual comparison of the original image and the compressed version, to help illustrate the differences. When you’re done editing, just tap download to save the image locally. Squoosh is an open-source tool, so if you’re interested in its inner-workings you can peep its code on GitHub.

Still, the irony of Google obsessing over page loading times while ignoring how bloated Chrome is won’t be lost on many users. Elsewhere, the big G is also campaigning to turn its fast-loading mobile web pages into a web standard — in the hopes that they’ll expand beyond the handful of current adopters (including itself, Twitter, Bing and Baidu) and bring the entirety of the web under its aegis. Eradicating notoriously bulky images is part and parcel of that master plan.

View the Original Article . . .


Mercedes’ AI research could mean faster package delivery

It looked and worked like a Ferris wheel and it seemed like a good idea. Packages would sit in the basket and rotate for easy access when about to be delivered. Then a gallon of milk got caught in a crossbeam and it exploded all over the back of the van. That system was scrapped.


A system that weighed packages also didn’t really pan out. Turns out, things (such as Diet Coke and regular Coke) weigh the same. Both of these ideas are part of a series of failures, but they didn’t derail the Mercedes Cargo Recognition and Organization System (CoROS) team. Instead, the researchers moved forward using data from their experiments and information from the folks who actually have to deliver boxes.

The latest plan throws out the idea of redesigning the inside of a delivery van with spinning wheels and sensor-filled shelves and instead relies on cameras and AI.

As a box is placed in the van, cameras on the ceiling scan the barcode; CoROS then determines the optimal shelf segment for the package, and blue LEDs direct the driver to that area.

So without whipping out a scanner gun or even breaking their stride, the delivery driver has logged a package onto their truck and figured out the best location to store it until it’s time to deliver it. At that point, the shelf with the package will once again light up so the delivery driver doesn’t have to remember where they put it.
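
A highly simplified sketch of that load-time logic appears below. The function, field names and segment layout are hypothetical, invented for illustration; Mercedes has not published the CoROS internals.

```python
from dataclasses import dataclass

@dataclass
class Package:
    barcode: str
    stop_number: int     # position on the planned delivery route
    van_id: str          # vehicle this package is routed to

SEGMENTS = ["A1", "A2", "B1", "B2", "C1", "C2"]   # shelf segments, door to rear

def assign_segment(pkg: Package, van_id: str, stops_per_run: int = 60) -> str:
    """Return a shelf segment, or raise if the package belongs on another van."""
    if pkg.van_id != van_id:
        raise ValueError(f"{pkg.barcode}: wrong vehicle - sound the alarm, glow red")
    # Packages for earlier stops go nearest the door, later stops toward the rear.
    idx = min(pkg.stop_number * len(SEGMENTS) // stops_per_run, len(SEGMENTS) - 1)
    return SEGMENTS[idx]

print(assign_segment(Package("PKG-0017", stop_number=3, van_id="VAN-7"), "VAN-7"))   # A1
```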


Once the item is scanned onto the truck it’s also tracked by the array of cameras utilizing infrared sensors for low-light situations. The system creates a 3D version of the van to help it find moved items. If the van makes a quick stop and the box moves (as they tend to do), CoROS updates the location. As the tracked item leaves the vehicle, it’s scanned once again and that information is shared with the entire shipping system.

If a driver tries to load a box onto a van it’s not supposed to be on, an alarm sounds and the back of the vehicle glows red. Possibly a bit of overkill for a deterrent, but if you’ve ever had to wait an extra day for an item because it was put on the wrong truck, you’ll appreciate the lengths to which the CoROS team has gone to make sure that won’t happen.

Using a combination of computer vision and AI, Mercedes believes it can help reduce the time needed to load packages onto vehicles and make sure said boxes are in the optimal spot for an AI-created delivery route. More importantly to delivery companies, the system is cost effective.


All the hardware is off the shelf and it’s the onboard computer and its communication with the backend that does all the hard work. Mercedes says that it has an API that would help a delivery company plug into its system. The team sees this as a solution not so much for FedEx and UPS, but for smaller delivery companies, typically the ones hired by companies like Amazon to deliver our goods.

As we purchase many of our items from online retailers, the number of deliveries being sent to our homes has skyrocketed. The result is more delivery vehicles on the road. Being able to optimize how those trucks and vans spend their time means they can either deliver more during a day’s run or at least spend less time on the road. That’s good news for companies, traffic and the environment. Less time on the road means fewer pollutants spewed into the sky.


Mercedes says that this system is out in the wild right now. It has teamed up with an unnamed courier and will be scaling that field test up early next year. The CoROS team will then take the information they learn from that to make changes to the system before a commercial launch. It’s gotta be better than the Ferris wheel idea. It’s not like cameras can blow up a gallon of milk.

View the Original Article . . .
