Human footprint driving mammal extinction crisis

Human impacts are the biggest risk factor in the possible extinction of a quarter of all land-based mammals, according to a University of Queensland study.

The Malayan Tapir has moved from Vulnerable to Endangered on the IUCN Red List, due to large-scale
deforestation associated with increased hunting [Credit: Tambako]

Researchers compared a 16-year trend in the global human footprint with the extinction risk of around 4500 land-based mammal species.

UQ School of Earth and Environmental Sciences Adjunct Fellow Dr. Moreno Di Marco said the analysis redefined how we looked at mammal extinctions.

“We live in an era when one in every four mammal species is at risk of going extinct,” he said.

“But with more than 5600 mammal species globally, it’s time consuming and expensive to track the changes for every species. To get a clearer idea of what’s systematically leading to these declines, we decided to combine mapping of human pressures with extinction risk assessment data for mammal species.”

The researchers found that human footprint was linked strongly to extinction risk change for land-based mammals – more than any other variable they tested.
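The kind of comparison described above can be sketched with a toy example. Everything below is synthetic and illustrative only: the variable names and the assumed probability relationship are hypothetical, whereas the actual study used real global human footprint maps and IUCN Red List assessment data for roughly 4500 species.

```python
# Toy sketch: relate change in a human footprint index to whether a species
# moved to a higher Red List threat category ("up-listed"). Synthetic data.
import random

random.seed(42)

species = []
for _ in range(1000):
    fp_change = random.uniform(0, 10)              # change on the 0-to-50 footprint scale
    p_uplist = min(1.0, 0.05 + 0.08 * fp_change)   # assumed (hypothetical) relationship
    up_listed = 1 if random.random() < p_uplist else 0
    species.append((fp_change, up_listed))

# Compare mean footprint change between up-listed and stable species
up = [fp for fp, r in species if r == 1]
stable = [fp for fp, r in species if r == 0]
print(f"mean footprint change, up-listed: {sum(up)/len(up):.2f}")
print(f"mean footprint change, stable:    {sum(stable)/len(stable):.2f}")
```

In this toy setup the up-listed group shows a larger mean footprint change, mirroring the direction of the association the study reports; the real analysis of course rests on observed, not simulated, data.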

Anthropogenic modification of natural habitats is the main driver of mammal species decline globally
[Credit: University of Queensland]

“Human impacts in areas originally in a natural or semi-natural state – those with a footprint of only three or below on a zero-to-50 scale – were the main driver of extinction risk change in mammal species,” Dr. Di Marco said.

“In terms of conservation efforts, it makes us look twice at what high-impact human activities really are, since even seemingly low-level impacts are decimating species.”

UQ’s Professor James Watson said the findings were invaluable for future conservation efforts.

“What we’ve created has huge potential to provide rapid assessment of species extinction risk, without having to go through extensive expert consultation every time,” he said.

“It has the potential to change how we assess biodiversity conservation status globally.

“The international community has a mission to prevent the decline of species, and this research will assist in the critical job of prioritising actions for minimising species extinction risk.

“They need to see the big picture, before it’s too late.”

The study has been published in Nature Communications.

Source: University of Queensland [November 09, 2018]

TANN



2018 November 9

Little Planet Lookout
Image Credit & Copyright: Gyorgy Soponyai

Explanation: Don’t panic. This little planet projection looks confusing, but it’s actually just a digitally warped and stitched, nadir-centered mosaic of images that covers nearly 360×180 degrees. The images were taken on the night of October 31 from a 30-meter-tall hill-top lookout tower near Tatabanya, Hungary, planet Earth. The latticed lookout tower was converted from a local mine elevator. Since planet Earth is rotating, the 126 frames of 75-second-long exposures also show warped, concentric star trails, with the north celestial pole at the left. Of course, at this location the south celestial pole is just right of center, but below the little planet’s horizon.

∞ Source: apod.nasa.gov/apod/ap181109.html

Spacetime: a creation of well-known actors?

Most physicists believe that the structure of spacetime is formed in an unknown way in the vicinity of the Planck scale, i.e. at distances of around 10⁻³⁵ m. However, careful consideration undermines the unambiguity of this prediction. There are quite a few arguments that the emergence of spacetime may instead occur as a result of processes taking place much “closer” to our reality: at the level of quarks and their conglomerates.

Just as the interactions between sand grains form a smooth surface on the beach, the spacetime known
to us could be the result of relations between quarks and their conglomerates [Credit: IFJ PAN]

What is spacetime? The absolute, unchanging, ever- and omni-present arena of events? Or perhaps it is a dynamic creation, emerging in some way on a certain scale of distance, time or energy? References to the absolute are not welcome in today’s physics. It is widely believed that spacetime is emergent. It is not clear, however, where the process of its emergence takes place. The majority of physicists tend to suppose that spacetime is created on the Planck scale, at distances of around 10⁻³⁵ m. In his article in Foundations of Science, Professor Piotr Zenczykowski from the Institute of Nuclear Physics of the Polish Academy of Sciences (IFJ PAN) in Cracow systematizes the observations of various authors on the formation of spacetime, and argues that the hypothesis of its formation at the scale of quarks and hadrons (or quark aggregates) is quite sensible for a number of reasons.
Questions about the nature of space and time have puzzled humanity since at least antiquity. Are space and time separate from matter, creating a “container” for motions and events occurring with the participation of particles, as Democritus proposed in the 5th century BC? Or perhaps they are attributes of matter and could not exist without it, as Aristotle suggested a century later? Despite the passage of millennia, these issues have not been resolved. Moreover, both approaches – albeit so contradictory! – are deeply ingrained in the pillars of modern physics. In quantum mechanics, events take place in a rigid arena with uniformly flowing time. Meanwhile, in the general theory of relativity, matter deforms elastic spacetime (stretching and twisting it), and spacetime tells particles how to move. In other words, in one theory the actors enter an already prepared stage to play their roles, while in the other they create the scenography during the performance, which in turn influences their behaviour.

In 1899, German physicist Max Planck noticed that certain combinations of the constants of nature yield very fundamental units of measurement. Only three constants – the speed of light c, the gravitational constant G and Planck’s constant h – were sufficient to create units of distance, time and mass, equal (respectively) to 1.62 · 10⁻³⁵ m, 5.39 · 10⁻⁴⁴ s and 2.18 · 10⁻⁵ g. According to today’s mainstream belief, spacetime would be created at the Planck length. In fact, there are no substantive arguments for the rationality of this hypothesis.
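The three units quoted above follow from dimensional analysis alone, and can be checked in a few lines. A minimal sketch (note that the quoted numerical values are conventionally obtained with the reduced Planck constant ħ = h/2π, not h itself):

```python
import math

# CODATA-level approximate values of the three constants
c = 2.998e8        # speed of light, m/s
G = 6.674e-11      # Newton's gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34   # reduced Planck constant, J*s

planck_length = math.sqrt(hbar * G / c**3)  # ~1.62e-35 m
planck_time   = math.sqrt(hbar * G / c**5)  # ~5.39e-44 s
planck_mass   = math.sqrt(hbar * c / G)     # ~2.18e-8 kg, i.e. 2.18e-5 g

print(planck_length, planck_time, planck_mass)
```

These are the only combinations of c, G and ħ with the dimensions of length, time and mass, which is exactly Planck’s observation.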

Both our most sophisticated experiments and theoretical descriptions reach the scale of quarks, i.e. the level of 10⁻¹⁸ m. So how do we know that along the way to the Planck length – over a dozen consecutive, ever smaller orders of magnitude – spacetime retains its structure? In fact, we are not even sure if the concept of spacetime is rational at the level of hadrons! Divisions cannot be carried out indefinitely, because at some stage the question of the next smaller part simply ceases to make sense. A perfect example here is temperature. This concept works very well on a macro scale, but when, after subsequent divisions of matter, we reach the scale of individual particles, it loses its raison d'être.

“At present, we first seek to construct a quantized, discrete spacetime, and then ‘populate’ it with discrete matter. However, if spacetime were a product of quarks and hadrons, the dependence would be reversed: the discrete character of matter should then enforce the discreteness of spacetime!” says Prof. Zenczykowski, and adds: “Planck was guided by mathematics. He wanted to create units from the fewest constants possible. But mathematics is one thing, and the relationship with the real world is another. For example, the value of the Planck mass seems suspicious. One would expect it to have a value rather more characteristic of the world of quanta. Instead, it corresponds to approximately 1/10 of the mass of a flea, which is most certainly a classical object.”

Since we want to describe the physical world, we should lean towards physical rather than mathematical arguments. And so, when we use Einstein’s equations to describe the Universe at large scales, it becomes necessary to introduce an additional constant, known as the cosmological constant Lambda. If, therefore, while constructing fundamental units, we expand the original set of three constants by Lambda, in the case of masses we obtain not one but three fundamental values: 1.39 · 10⁻⁶⁵ g, 2.14 · 10⁵⁶ g, and 0.35 · 10⁻²⁴ g. The first of these could be interpreted as a quantum of mass, the second is at the level of the mass of the observable Universe, and the third is similar to the masses of hadrons (for example, the mass of a neutron is 1.67 · 10⁻²⁴ g). Similarly, after taking Lambda into account, a unit of distance of 6.37 · 10⁻¹⁵ m appears, very close to the size of hadrons.
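One plausible reconstruction of such four-constant combinations can be sketched numerically. The particular formulas below are an assumption (the press release does not spell them out), but they are the natural dimensional combinations of c, G, h and Lambda, and with the approximate observed value Λ ≈ 1.1 · 10⁻⁵² m⁻² they land at the same orders of magnitude as the values quoted above:

```python
import math

c = 2.998e8      # speed of light, m/s
G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
h = 6.626e-34    # Planck constant, J*s
Lam = 1.1e-52    # cosmological constant, 1/m^2 (approximate observed value)

s = math.sqrt(Lam)  # 1/m

m_quantum  = h * s / c                      # ~10^-68 kg: candidate "quantum of mass"
m_universe = c**2 / (G * s)                 # ~10^53 kg: mass scale of the observable Universe
m_hadron   = (h**2 * s / G) ** (1 / 3)      # ~10^-28 kg: hadron-like mass scale
l_hadron   = (h * G / (c**3 * s)) ** (1 / 3)  # ~10^-15 m: hadron-like length scale

print(m_quantum, m_universe, m_hadron, l_hadron)
```

The exact numbers shift with the adopted value of Lambda and with the choice of h versus ħ, so this is an order-of-magnitude check rather than a reproduction of the paper’s values.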

“Playing games with constants, however, can be risky, because a lot depends on which constants we choose. For example, if spacetime were indeed a product of quarks and hadrons, then its properties, including the velocity of light, should also be emergent. This in turn means that the velocity of light should not be among the basic constants,” notes Prof. Zenczykowski.

Another argument in favour of the formation of spacetime at the scale of quarks and hadrons is the properties of the elementary particles themselves. For example, the Standard Model does not explain why there are three generations of particles, where their masses come from, or why there are so-called internal quantum numbers, which include isospin, hypercharge and colour. In the picture presented by Prof. Zenczykowski, these values can be linked to a certain six-dimensional space created by the positions of particles and their momenta. The space thus constructed assigns the same importance to the positions of particles (matter) and their movements (processes). It turns out that the properties of masses or internal quantum numbers can then be a consequence of the algebraic properties of 6D space. What’s more, these properties would also explain the inability to observe free quarks.

“The emergence of spacetime may be associated with changes in the organization of matter occurring at a scale of quarks and hadrons in the more primary, six-dimensional phase space. However, it is not very clear what to do next with this picture. Each subsequent step would require going beyond what we know. And we do not even know the rules of the game that Nature is playing with us, we still have to guess them! However, it seems very reasonable that all constructions begin with matter, because it is something physical and experimentally available. In this approach, spacetime would only be our idealization of relations among elements of matter,” sums up Prof. Zenczykowski.

Source: The Henryk Niewodniczanski Institute of Nuclear Physics [November 09, 2018]
