
Rumors Suggest Nvidia Might Re-Launch RTX 2060, RTX 2060 Super

This site may earn affiliate commissions from the links on this page. Terms of use.

Nvidia announced its RTX 3060 during CES last week, but according to one report, the company has actually restarted production of its RTX 2060 and RTX 2060 Super. If true, it would mean Nvidia doesn’t think it can alleviate the graphics card shortage quickly enough if it relies solely on 7nm GPUs.

The rumor comes from French site Overclocking.com, which claims to have gotten confirmation from several brands. Reportedly, Nvidia shipped out a new set of RTX 2060 and 2060 Super GPUs to re-enable the manufacture of these cards. If true, Nvidia could potentially alleviate the GPU shortage by relying on TSMC’s older (and presumably less-stressed) 12nm product line.

Nvidia showed the following slide during the RTX 3060 launch. It gives some idea of how the two compare, though DLSS does not appear to be used for the RTX 2060, and there’s no 2060 Super.

Nvidia’s published claims about RTX 3060 versus 2060 performance. Remember, DLSS is enabled on some RTX 3060 benchmarks.

Either way, there should be some room in the product market beneath the RTX 3060 to carve out space for the 2060, 2060 Super, or both.

How’d We Get Here, Anyway?

We’re in this position today because Nvidia wanted to avoid a repeat of Turing’s disastrous launch. Back in 2018, Nvidia repeatedly told investors that the huge spike in GPU sales through 2017 and into 2018 was being driven by gamers, not by cryptocurrency mining. It’s never been clear how true that was — and Nvidia has been sued by shareholders over the idea that the firm knew full well where its demand was coming from. But whether the company misread the market or not, it appears to have been genuinely caught off-guard when the crypto market cooled off. This left a lot of Pascal GPUs on shelves that had to be moved.

Turing’s second problem was its pricing. Nvidia raised prices across the Turing lineup, which proved unwise when Pascal cards were hitting some of the best prices of their lives, and sales suffered.

Turing’s third problem was that its major feature wasn’t supported in any shipping titles yet. This is not unusual when major new features are introduced to gaming — hardware support has to precede software support, because the arrow of time is annoying and inconvenient — but it still counts as a drag on the overall launch.

This time around, Nvidia wanted to avoid these issues. Turing production was discontinued well before Ampere launched. The end-user community was deeply unhappy with Nvidia’s Turing pricing, and Nvidia, to its credit, adjusted its prices. The non-availability of ray tracing, similarly, is not a problem here. While the number of ray-traced games remains small, there’s now a small collection — including AAA titles — with RTX / DXR support integrated.

Nvidia did everything right, in terms of building appeal for gamers. The one thing it didn’t count on was the impact of the COVID-19 pandemic on semiconductor demand. Bringing back the RTX 2060 and 2060 Super could give Nvidia a way to respond to this problem without sabotaging its new product lineup.

Frankly, it’d be nice to see the RTX 2060 and 2060 Super back in-market, if only to bring a little stability to it. Here are Newegg’s current top-selling GPUs as of 1/20/2021:

It’s not unusual for the Top 10 to have a few cheap cards in it, but every GPU with any horsepower whatsoever is far above retail price.

Newegg’s best-selling GPUs are bottom-end Pascal cards. The last-gen RX 580 and the GTX 1660 Super are the only two consumer cards selling for under $500. Both of them are terrible deals at this price point.

There’s always a bunch of low-end garbage stuffed into the GPU market. Typically, these parts live below the $100 price point, where you’ll find a smorgasbord of ancient AGP cards, long-vanished GPU microarchitectures, and rock-bottom performance that almost always costs too much. Today, the garbage has flooded into much higher price points. Want a GTX 960? That’ll be $150. How about a GTX 460 for $145 or an HD 7750 for $155? There’s a GTX 1050 Ti for $170, which is only $40 more than the GPU cost when new, over four years ago.

Right now, it’s impossible to buy any GPU for anything like MSRP. If bringing the RTX 2060 and RTX 2060 Super back to market actually provides some stability and some kind of modern GPU to purchase, I’m in favor of it. At this point, it wouldn’t be the worst thing in the world if AMD threw the old Polaris family back into market, either. While they wouldn’t be a great value at this point ordinarily, the cheapest RX 5500 XT at Newegg is $397. Under these circumstances, any midrange GPU manufactured in the last four years that can ship for less than $300 would be an improvement.

The past five years have been the worst sustained market for GPUs in the past two decades. GPU prices have been well above MSRP for 24 of the past 56 months, dating back to the launch of Pascal in late May 2016. This isn’t expected to change until March or April at the earliest. When cards aren’t available at MSRP for nearly half the time they’ve been on the market over five years and two full process node deployments, it raises serious questions about whether we can trust MSRPs when making GPU recommendations. Right now, the best price/performance ratio you can get in the retail market might be an RX 550 for $122.

The GPU market in its current form is fundamentally broken. Manufacturer MSRPs have the same authority as any random number you might pick out of a hat. There are a lot of factors playing a part in the current situation, including manufacturing yields and COVID-19, but this problem started four years before the pandemic.

AMD and Nvidia need to find a better way to ensure that customers can actually buy the cards they want to purchase, or they need to delay their launches long enough to build a stockpile capable of supplying launch demand for days, not seconds. Alternately, they may need to hold launches until yields and availability are high enough to ensure a constant stream of hardware to buyers.

Right now, we have launch days that sell out instantly and interminable delays between new shipments. If these rumors are true, and we hope they are, Nvidia bringing back the RTX 2060 and 2060 Super will help a little in the short term, but what we obviously need is for AMD and Nvidia to take a fundamentally different approach to product inventory management. As things stand, these aren’t product launches. They’re product teases.



ExtremeTechGaming – ExtremeTech

Quebecers need to further reduce contacts to slow spread of COVID-19, projections suggest

Quebecers will need to be more diligent about physical distancing and further reduce their contacts to avoid a rise in the number of COVID-19 cases, hospitalizations and deaths, according to the latest projections by government-affiliated experts. 

The projections suggest that even with the closure of bars and restaurants, the cancellation of organized sports and further restrictions in schools put in place at the beginning of October, those numbers will continue to rise into the New Year.

But if the population reduces its contacts by another 25 per cent, according to one model, by maintaining two metres of physical distance in public spaces, wearing masks and limiting gatherings, the spread of the virus is likely to plateau and even decline. 

The findings were presented Friday by Quebec’s public health research institute, the INSPQ.

The presentation included three mathematical models: one if no measures had been introduced after cases started to climb in the middle of August and September; another if the restrictions imposed in early October are maintained; and a third that showed the impact of reducing contacts. 

Dr. Jocelyne Sauvé, vice-president of scientific affairs at the INSPQ, said the modelling suggests Quebec was headed for a “fairly catastrophic” rise in cases in September had nothing been done, with a death toll that could have exceeded the first wave.

But she said the projections show that further effort from the population will be necessary to stabilize the pandemic.

Marc Brisson, a health economics professor at Université Laval who presented the findings, said in practical terms this means further cutting back on non-essential contacts and keeping two metres apart while, for example, speaking to another parent during school drop off.

All three models were prepared with the assumption that Quebec’s long-term care homes and private seniors’ residences are better protected than they were in the spring.

Brisson said the INSPQ is preparing another round of projections to be released later this fall that would include the impact of more effective testing and contact tracing on the rate of transmission.

The previous round of INSPQ projections was released in July and forecast that a second wave would hit Quebec sometime in September. Its force would depend on how well Quebecers were following health guidelines.

Health Minister Christian Dubé said Friday the projections reinforce what the province has been saying — that the actions of individuals have major consequences.

Dubé also pointed to another study, prepared by Quebec’s health research institute INESSS, that suggested Quebec’s hospitals were under less strain than they had been a week ago.

“We have been doing well on stabilization, but we want those cases to continue to decline. Why do we do this? We want to protect our health system and have as few victims as possible,” he said.

“We would have hit a wall if we didn’t do what we did Oct. 1.”


CBC | Health News

Early signs suggest race matters when it comes to COVID-19. So why isn’t Canada collecting race-based data?

How do you solve a problem you can’t see?

That’s the question several researchers and health professionals across the country are pressing Canada to consider as the battle against COVID-19 rages on. The fear: the virus will kill overwhelming numbers of people in populations we simply aren’t paying attention to.

There’s a blind spot in this country’s approach to combating the virus, these advocates say, and it’s one that for many could make the difference between life or death: race-based data.

“We know that people who are poor, people who are homeless, Indigenous populations and also our refugee, immigrant and racialized populations, they’re more likely to have chronic diseases because chronic diseases go with poverty and they go with low income,” said Dr. Kwame McKenzie of the Toronto-based Wellesley Institute.

When resources are stretched, McKenzie said, people with chronic diseases may not find themselves at the top of the list for intensive care and ventilators. He said that in some cases, people with underlying conditions have been denied access to those resources during the pandemic, with the priority going to those deemed more likely to survive.

“You have to collect the data to do good medicine.”

Canada doesn’t track race or ethnicity as part of its data collection around COVID-19. And that dearth of data has come into sharp focus as Canadians look across the border to their southern neighbour, the United States, which has emerged as the hardest-hit country in the world, with a death toll surpassing 36,000. 

‘No plans’ to collect race-based data in Canada

In parts of the U.S., an overwhelming number of black and Latino residents have died of the virus compared with other groups, even where they are a minority. Consider Chicago, where black residents are 30 per cent of the population but make up more than 70 per cent of the COVID-19-related deaths. 

In New York City, more Latino and black residents have died of the virus than their white or Asian counterparts, according to figures released by New York’s health department, which cautions its statistics are not comprehensive. Indeed, in the 12 states reporting race and ethnicity data around COVID-19, black residents were found to be 2.5 times more likely to die of the virus than the general population, according to the public policy research group APM Research Lab. 


But as to whether Canada intends to collect that sort of data, a spokesperson for chief public health officer Dr. Theresa Tam told CBC News this week, “There are currently no plans to add more social determinants of health (such as education or income) as risk factors to the case reporting form used for the collection of COVID-19 data.”

Asked last week if Ontario planned to collect such data, the province’s chief medical officer of health, Dr. David Williams, replied that the groups identified to be most at risk are the elderly, people with underlying conditions and those with compromised immune systems.

“So those are all priorities to us, regardless of race, ethnic or other backgrounds. They’re all equally important to us,” Williams answered. 


‘It’s really concerning’

Ontario’s Anti-Racism Act allows the government to mandate race-based data collection across various sectors, but the province has said health-care providers aren’t authorized to do the same because of privacy considerations. 

Williams’s response was met with criticism from several health-care advocates and professionals, including Suzanne Obiorah, the director of primary care at Ottawa’s Somerset West Community Health Centre.

“It’s really concerning. It’s almost like there wasn’t an acknowledgement of existing health disparities,” Obiorah told CBC News. 

“It doesn’t allow us to fully understand the impacts of COVID in vulnerable communities. And then it doesn’t help us to organize ourselves to target vulnerable communities in a focused way.”

Poverty means marginalized groups are more likely to have to continue working through the pandemic, often on the front lines as cleaners, bus drivers and at grocery stores, she and others point out. 


With about half of Canada’s COVID-19-related deaths taking place inside care facilities, Obiorah says, federal and provincial governments have been able to rework their approach to prioritize the senior population.

“But what informed us to be able to do that was data,” she said.

Alberta acknowledges some ‘systemically disadvantaged’

Obiorah and a group of black medical professionals are now petitioning for officials to immediately mandate the collection of race and socio-demographic-based data.

“Without an evidence base, the inequitable experiences of marginalized populations are dismissed as anecdotal and interventions are not prioritized,” they said in an open letter to the Ontario government.

Last week, Alberta’s chief public health officer committed to begin looking into race-based data collection. “We know that certain groups of people are systemically disadvantaged,” Dr. Deena Hinshaw said, adding that it could work with First Nations groups to pull specific data from the provincial system. 

During the H1N1 pandemic in 2009, Indigenous people in Canada were six-and-a-half times more likely to end up in intensive care units, Toronto-based pediatric infectious disease specialist Dr. Anna Banerji told CBC News.


At the time, Health Canada sent dozens of body bags to some of the hardest-hit reserves in Manitoba, as part of a shipment of hand sanitizers and face masks. The federal agency later apologized, but the action left some members of the community feeling they simply weren’t a priority to Canada. 

“Is the body bags a statement from Canada that we as First Nations are on our own?” Wasagamack Chief Jerry Knott asked at the time.

“Knowing that these communities are at higher risk for multiple reasons should be a call for action,” said Banerji.

“We say that this virus affects people equally. It really doesn’t. It affects people who don’t have the resources really to not work and to buy gloves and to take care of themselves.”


CBC | Health News

Plastic labelled ‘BPA free’ might not be safe, studies suggest

It’s hard to walk down the kitchenware aisle in a Canadian store without noticing the “BPA-free” labels on plastic bottles and containers.

Consumers usually assume these labels mean products are safer, or better, because they do not contain the harmful chemical bisphenol A (BPA).

BPA is used to manufacture polycarbonate, the hard, clear plastic from which some bottles and other food containers are made. When foods are in direct contact with the plastic, small amounts of BPA may migrate into those foods, prompting increased public pressure to move away from its use.

But new research suggests the chemicals now used as substitutes for BPA, mainly other bisphenols, may have negative health impacts similar to those caused by BPA. Health and environmental advocates are raising questions about the safety of those substitutes.

And the record of what impact such products may have is confused, because there is little information for consumers on what substances are being used to replace BPA.


BPA has been used for more than 60 years and is also found in the epoxy resin that lines food cans, as well as items including baby bottles, teething rings, baby clothing, register receipts and dental sealants. 

Manufacturers began using BPA substitutes in response to Canada’s 2010 ban on BPA in baby bottles, which followed an extensive assessment that concluded it was toxic. International research found BPA is an endocrine disruptor, capable of interrupting the normal process of human growth and development, and may be linked to poor neurobehavioural functioning, obesity and cancer.

The safety of BPA is still under debate. 

Although the ban was primarily meant to protect infants, it resulted in the widespread introduction of “BPA-free” products, including reusable water bottles and lunch containers. 

Substitutes show up in food, blood

But researchers are worried that the chemicals used to replace BPA, things like bisphenol S, bisphenol F and bisphenol B, are starting to show up in food, house dust, blood and urine. 

These newer chemicals were chosen because they were similar enough to BPA to serve the same function — namely, to produce strong, clear plastics. But growing evidence suggests they may also be endocrine disruptors.

A 2019 study in the journal Toxicology reviewed hundreds of studies on two dozen different BPA substitutes and concluded that some “have health or toxicological effects at concentrations similar to or lower than BPA.” 

Then-federal health minister Tony Clement announces a plan in 2008 to ban the import and sale of plastic baby bottles containing BPA. (Chris Wattie/Reuters)

In other words, these chemicals may have the same harmful effects as BPA, but at lower levels. Almost all of the BPA substitutes showed some hormonal influence, suggesting they could affect growth and reproduction. 

“This is something that the scientific community has been warning regulators about for a very, very long time,” said Muhannad Malas, toxics program manager at Environmental Defence, an advocacy group.

“Chemicals like BPA, that end up getting phased out through regulation or through voluntary corporate action, are being replaced by what are referred to as ‘regrettable substitutes.'”

Not convinced

Researchers argue that other bisphenols are being used as substitutes in plastic products, based on their growing presence in the environment and in our bodies. Pinpointing the exact source is challenging because companies are not required to list them as ingredients. Plastic is one of the likely sources, but food cans, receipt paper and clothing may also contribute.

Still, Steve Hentges, a senior director at the American Chemistry Council, which represents the chemicals industry, is not convinced.

“There is little evidence or reason to believe that BPA is being replaced with other bisphenols,” he said on behalf of the organization.

The 2019 Toxicology study noted that some BPA substitutes had not been studied, possibly because it was “unclear if they are chemicals in current use.” The authors also noted that there was limited information on other BPA substitutes known to be in use, and stressed the need for more information about the levels of these chemicals in human populations.

Health Canada is aware of these concerns. The department confirmed in an emailed statement that “certain bisphenols have been identified for further scoping, and further information gathering is ongoing.” It added that future releases of the Canadian Health Measures Survey will measure the most common BPA substitutes to assess their impact on Canadians.  

BPA substitutes generally not disclosed

It is not easy for consumers to identify which plastics contain BPA substitutes, because manufacturers generally do not disclose this information on product packaging. “The only way for a concerned parent to verify is by contacting the company and asking the question,” Malas said. “In many cases the company may not even know.”

Identifying the substitutes is further complicated by widespread use of the BPA-free label.

“My suspicion is that the BPA-free label is marketing,” said Erica Phipps, executive director of Canadian Partnership for Children’s Health and Environment, an organization that, in a 2010 position paper, advocated for BPA to be replaced with safer alternatives.

“It’s responding to the fact that BPA has now been identified as something that we’d like to avoid exposure to. You’ll see it on categories of products that never had BPA to start with.”


CBC | Health News

Leaked GPU Specs Suggest Xbox Series X Substantially More Powerful Than PS5


A new set of rumors have leaked regarding the next-generation Xbox and PS5 and the GPUs both consoles will bring to market. We’ve known the broad specs of both platforms for a bit — both use AMD GPUs and CPUs, with the GPU based on AMD’s most recent RDNA architecture, while the CPU is derived from the same 7nm Ryzen CPU cores that launched earlier this summer. What we’ve lacked is specific details on the GPU cores themselves.

Eurogamer has gotten its hands on some leaked data it believes is fairly legitimate, and the website’s track record with this kind of information is solid. There have been some rumored APU configurations that leaked earlier this year, but this new data implies the Sony PS5 will feature 36 GPU clusters clocked at up to 2GHz. Supposedly the silicon, codenamed Oberon, is designed to operate in three different modes (Gen 0, 1, and 2) with clocks of 800MHz, 911MHz, and 2GHz, respectively. Memory bandwidth is reportedly 448GB/s in Gen 2 mode (though 512GB/s is an alternate possibility), and the GPU can also supposedly be variably configured in terms of ROP and core counts. Eurogamer states:

While a 2.0GHz GPU clock is used for what is described as the fully unlocked ‘native’ or ‘Gen2’ mode, the processor is also tested in what is referred to as Gen1 and Gen0 modes. The former is explicitly stated as running with 36 compute units, a 911MHz core clock, 218GB/s of memory bandwidth and 64 ROPs – the exact specifications of PlayStation 4 Pro. The latter Gen0 mode cuts the CU and ROP counts in half and runs at 800MHz, a match for the base PS4. The indications are that back-compat is an integral part of the silicon, which in turn raises some interesting questions about the makeup of the Navi GPU and the extent to which older GCN compatibility may be baked into the design.

The implication here is that the PS5 SoC contains multiple GPU clusters, just like the PS4 Pro did. Using multiple GPU clusters in the same SoC would give Sony the same ability to turn the clusters on or off depending on which mode the GPU was running in. Alternately, the GPU cluster could be physically unified but designed to allow for this kind of fine-grained power gating. Stamping out identical clusters would be simpler; designing a unified cluster with fine-grained gating is probably more complex but saves on die space.
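As an aside, the 448GB/s and 512GB/s bandwidth figures are both consistent with plausible GDDR6 configurations. A quick sketch of the arithmetic — the 256-bit bus width and per-pin speeds below are assumptions for illustration, not part of the leak:

```python
# GDDR6 bandwidth: bus width (bits) * per-pin data rate (Gbps) / 8 bits per byte.
# The 256-bit bus and 14/16Gbps pin speeds are assumed, not from the leak.
def bandwidth_gb_s(bus_width_bits: int, pin_speed_gbps: int) -> float:
    return bus_width_bits * pin_speed_gbps / 8

print(bandwidth_gb_s(256, 14))  # 448.0 GB/s -- the reported Gen 2 figure
print(bandwidth_gb_s(256, 16))  # 512.0 GB/s -- the alternate possibility
```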

As for the Xbox Series X, Eurogamer is implying this console packs serious firepower. Here’s the rumored configuration:

Image by Eurogamer

If this rumor proves true — always something to keep in mind — the Xbox Series X will launch packing the equivalent of a high-end PC GPU. The largest GPUs AMD has ever built are cards like the R9 Fury X and Vega 64, with 4,096 cores. A 56-cluster Navi GPU would pack 1.4x as many GPU cores as the 5700 XT, which already competes in the high-end PC GPU segment at the ~$400 price point. While AMD is expected to launch Navi 20 before the Xbox Series X debuts, we haven’t seen any indication that the company intends to dramatically expand the number of GPU cores it offers — Navi improved on GCN’s performance by making the individual cores more efficient as opposed to simply throwing more cores at the problem. It’s highly unlikely, in other words, that AMD would build a 56 CU design for Microsoft and then ship a 128 CU design into the PC market.

If this rumor proves true, Microsoft is playing a far more aggressive game than it did last generation. Let’s assume, for the sake of argument, that AMD ships an 80 CU version of Navi 20, which comes out to 2x Navi 10. That would give the Xbox Series X 3,584 GPU cores compared to 5,120 for Navi 20, or about 70 percent as many.
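The core-count arithmetic above is easy to sanity-check. A minimal sketch, assuming RDNA’s 64 stream processors per compute unit (the CU counts themselves are from the rumor, not confirmed):

```python
# Sanity check of the rumored core counts.
# Assumption: 64 stream processors (SPs) per RDNA compute unit (CU).
SP_PER_CU = 64

def total_cores(cu_count: int) -> int:
    """Total stream processors for a given CU count."""
    return cu_count * SP_PER_CU

xsx_cores = total_cores(56)      # rumored Xbox Series X: 56 CUs -> 3,584 cores
navi20_cores = total_cores(80)   # hypothetical 80-CU Navi 20   -> 5,120 cores

# Xbox Series X as a share of the hypothetical Navi 20: about 70 percent
print(round(xsx_cores / navi20_cores * 100))
```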

In 2013, the Xbox One shipped with 768 GPU cores. The month before, AMD had shipped the R9 290X, with 2,816 cores. The PS4, at debut, had 1,152 cores. The Xbox had 27 percent as many GPU cores as the R9 290X, while the PS4 had 41 percent. While we can’t draw linear comparisons between console and PC performance strictly on the basis of GPU core count, the PC GPU was obviously far larger, with significantly more compute and graphics resources.

If — again, if — these rumors are true, the gaps are going to be a lot narrower this time around. The 1.7GHz clock speed on the Xbox Series X’s GPU is required to hit a supposed target of 12TFLOPS, but Eurogamer didn’t get that clock speed directly from the leak. The gap in GPU performance between the PS5 and XSX would be partially offset by faster clocks on the PS5, but only partially.
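That 1.7GHz figure can be back-calculated from the 12TFLOPS target. A rough sketch, assuming the standard FP32 throughput formula of 2 FLOPs per stream processor per clock (one fused multiply-add) and 64 stream processors per CU:

```python
# What clock does a 56-CU RDNA GPU need to hit 12 TFLOPS of FP32?
# Assumptions: 64 SPs per CU, 2 FLOPs per SP per clock (one FMA).
target_tflops = 12.0
stream_processors = 56 * 64              # 3,584
flops_per_clock = stream_processors * 2  # 7,168

clock_ghz = target_tflops * 1e12 / flops_per_clock / 1e9
print(round(clock_ghz, 2))  # ~1.67, consistent with the rumored 1.7GHz
```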

Frankly, the spec gap between the PS5 and XSX is large enough that you could argue the Xbox specs are less likely to be true. It’s also possible Microsoft decided to pull out all the stops after the disaster of the Xbox One. Doubling down on beating Sony in raw performance from Day 1 might represent Microsoft’s big idea for preventing a repeat of what happened last generation.

If the Xbox rumors are accurate there doesn’t seem to be a way for MS to sell the console at $ 400 without losing money — and I’ve got doubts about $ 500 as well, given that the system is expected to also use a high-speed NVMe-attached SSD and GDDR6. Hard drives might be slow, but they’re still cheaper than the equivalent amount of solid state storage. That doesn’t mean MS can’t pursue a loss-making strategy, but both MS and Sony opted not to do that with the initial Xbox One / PS4 after taking heavier-than-expected losses on X360 and PS3 (particularly in Sony’s case).

This kind of configuration would make a lot more sense if Microsoft is serious about a lower-end version of the console and intends to debut both. The PS5’s smaller GPU looks more like what we’d expect from a generational update. On the other hand, if this points to an upper-end Xbox Series X, it means that version of the console is going to pack high-end PC-equivalent performance. With a 56-CU Navi, 8-core Ryzen 7nm CPU and 560GB/s of system memory bandwidth, there’s no way it could perform like anything else.



ExtremeTechExtreme – ExtremeTech

Measles limits immune system’s ability to fight off other infections, studies suggest

The measles virus has not only made a devastating resurgence worldwide, but it may also cripple the immune system’s ability to fight off other infections in the long term, two new studies suggest.

The highly contagious measles virus causes coughing, rashes and fever and can lead to serious complications. Last month, the World Health Organization said reported cases rose 300 per cent globally in the first three months of this year compared with the same period in 2018.

A two-dose vaccine has helped to slash measles cases since 2000, saving an estimated 21.1 million lives between 2000 and 2017, WHO said.

But the rise of anti-vaccination campaigns, non-vaccinating religious communities and other factors have led to outbreaks causing tens of thousands infections in Congo, Madagascar, the Philippines, Sudan, Thailand and Ukraine, among other countries, according to WHO.

Now researchers say measles vaccination not only controls measles, but it also protects the immune system from losing its ability to suppress other infections.

Stephen Elledge, a geneticist at Brigham and Women’s Hospital in Boston and a co-author of one of the papers presenting evidence that the measles virus destroys part of the immune system, compared the damage to a head injury.

An illustration of a spherical measles virus particle. The virus is highly contagious. (Alissa Eckert/CDC)

“You can be in an accident and you could have a head injury and you could lose your memory,” Elledge said. “The measles virus is like an accident for your immune system. It loses its memory. But we know how to deal and prevent a lot of these head injuries. It’s called seatbelts and vaccines are like a seatbelt for your immune system.”

Normally, when we’re exposed to an infection, the immune system’s antibody factory kicks into high gear to recognize, remember and defend against a pathogen if it comes along again by making lots of fighter cells.

But the two teams of researchers found that after measles infection, the fighter cells, called B cells, seemed to be wiped out for months or even years. Without them, the body is vulnerable to pathogens such as the influenza virus or pneumococcal bacteria that cause ear infections.

The experiments were only possible because 77 families of unvaccinated children in the Netherlands agreed to donate blood samples after they were infected in a measles outbreak in 2013. Researchers compared those samples to ones from healthy volunteers.

The highly contagious measles virus causes coughing, rashes and fever and can lead to serious complications. (U.S. Centers for Disease Control and Prevention)

In infants who were not vaccinated and caught measles, the level of antibodies that could fight off other infections plunged. That didn’t happen in infants who were vaccinated against measles.

At the same time, Colin Russell, a professor of applied evolutionary biology at the Academic Medical Center at the University of Amsterdam, and his team looked at how the immune system loses its ability to fight other infections using another laboratory method. 

Together, the studies offer independent and complementary pieces of evidence of how the immune system can lose its memory, Russell said.

Consequences can last years

Russell said in 10 per cent of the unvaccinated children, one part of the immune system found in the bone marrow — the naïve component — was completely reset. Their immune system reverted to a naïve, infant-like state.

What’s more, the degree of suppression was comparable to taking powerful, immunosuppressive drugs for organ transplants.

“The immune consequence for measles can last for years and all of this really just goes to underscore the importance of measles vaccination because all of these things are entirely preventable just by getting your kids vaccinated,” Russell said.

Dr. Manish Sadarangani, who directs the Vaccine Evaluation Centre at BC Children’s Hospital, wasn’t involved in the research, but called it fascinating and important work that uses cutting-edge techniques to study how the measles virus affects the human immune system.

Previously, evidence on how measles infection suppresses the immune system was just circumstantial, Sadarangani said.

Sadarangani called it “incredible” how measles infection profoundly affects parts of the immune system similar to powerful immunosuppressive drugs.

Sadarangani said when he speaks to parents who have concerns about measles, some ask about complications that mostly occur in the few days after measles infections.

“This adds a whole new dimension,” he said. “It makes the argument to my mind of vaccinating much more powerful because you’re now preventing not just measles but all of these knock-on complications that may be related to measles.”

The researchers said limitations of their work include the small number of subjects from just one population. Some of the authors also serve as advisers to vaccine makers. 


CBC | Health News

Russian Investigators Suggest ISS Damage Could Have Been Sabotage

This site may earn affiliate commissions from the links on this page. Terms of use.

Astronauts scrambled last week to find and patch a small hole in the International Space Station (ISS) that threatened to leak the station’s atmosphere into space. The crew eventually discovered a tiny puncture in the Russian Soyuz capsule docked to the station. The hole was first identified as a micrometeoroid puncture, but now that’s looking less likely. Russia suggests this damage was caused either accidentally or on purpose by human hands. Did someone try to sabotage the ISS?

Authorities are adamant that the six-person crew of the ISS was not in danger at any point as they hunted for the leak. The hole caused a drop in cabin pressure, which is still something you want to address even if it’s not imminently deadly. Astronauts patched the hole with a special type of bonding tape and the crisis was averted. Since the damage was in the Russian module, Russia was tasked with the investigation.

At first, everyone seemed content with this explanation — after all, there are many thousands of pieces of space junk scattered around Earth that could have made a hole that size. But now Russia has called the micrometeoroid cause into question. Russia’s space agency chief Dmitry Rogozin said in a televised appearance that the damage is not consistent with an impact. He said the hole was made by a drill, and that it appears the drill wavered, leaving scuff marks around the hole. NASA deleted the images it posted publicly with the micrometeoroid explanation attached, but to the uninformed eye, they do look sort of like drill holes.


As for whether or not this is a case of sabotage, that depends on how exactly the hole got there. A Russian firm called Energia manufactures Soyuz capsules for the government, and its employees have in the past made mistakes that led to similar damage. In one instance, a technician drilled through the hull and attempted to hide the damage with epoxy. However, the damage was detected pre-flight, and the worker was fired.

Some have wondered if a resident of the ISS caused the damage by accident or on purpose, but it’s more likely this hole was present while the capsule was still on the ground. It flew to the ISS in June carrying three passengers: Russia’s Sergey Prokopyev, Germany’s Alexander Gerst, and Serena Auñón-Chancellor of the U.S. Russian operators did not detect any issues at the time, but the hole may have been patched and later failed in orbit. NASA says it is withholding judgment until the Roscosmos investigatory committee completes its work.

Now read: Boeing and SpaceX Might Not Be Ready for Manned Flights in 2019, Floating IBM Robot Ships Out to International Space Station, and International Space Station Soon to Be Coldest Place in Known Universe


ExtremeTech

Lawyer who met Trump Jr. worked more closely with Russian officials than thought, docs suggest

The Moscow lawyer said to have promised Donald Trump's presidential campaign some dirt on his Democratic opponent worked more closely with senior Russian government officials than she previously let on, according to documents reviewed by The Associated Press.

Scores of emails, transcripts and legal documents paint a portrait of Natalia Veselnitskaya as a well-connected lawyer who served as a ghostwriter for top Russian government lawyers and received assistance from senior Interior Ministry personnel in a case involving a key client.

The data was obtained through Russian opposition figure Mikhail Khodorkovsky's London-based investigative unit, the Dossier Centre, which is compiling profiles of Russians it accuses of benefiting from corruption.

The Associated Press was unable to reach Veselnitskaya for comment. Messages from a reporter sent to her phone were marked as "read" but weren't returned.

Veselnitskaya has been under scrutiny since it emerged last year that Trump's eldest son, Donald Jr., met with her in June 2016 after being told by an intermediary that she represented the Russian government and was offering Moscow's help defeating rival presidential candidate Hillary Clinton.

U.S. President Donald Trump's son, Donald Trump Jr., speaks at an event in Pennsylvania on March 12. Donald Jr. met with Veselnitskaya in June 2016. (Brendan McDermid/Reuters)

Veselnitskaya has denied acting on behalf of Russian officialdom when she met with the Trump team, telling Congress that she operates "independently of any government bodies."

But the Dossier Centre's documents suggest her ties to Russian authorities are close — and they pull the curtain back on her campaign to overturn the sanctions imposed by the U.S. on Russian officials.

For example, the emails show that Veselnitskaya was mixed up in the Russian government's attempt to extract financial information from the former law firm of Bill Browder, the American-born British businessman who was a longtime critic of the Kremlin.

"She's an agent of the Russian government and not an independent lawyer as she claims." – Bill Browder, businessman and Kremlin critic

An Oct. 31, 2017, email shows Veselnitskaya's office preparing a draft version of Russian Deputy General Prosecutor Mikhail Alexandrov's affidavit to Cypriot authorities. "This is needed by tomorrow," she wrote a subordinate.

Two weeks later, a finalized version of the same document was sent by a Russian diplomatic staffer to a Cypriot counterpart, the Dossier Centre's files show.

Browder, who has often clashed with Veselnitskaya in and out of court, said this reinforced the idea that she was enmeshed with Russian officialdom.

"If her office is drafting replies for Russian-Cyprus law enforcement co-operation, in my opinion that effectively shows that she's an agent of the Russian government and not an independent lawyer as she claims," he said in a telephone interview.

Kremlin support

In a written statement, the Russian Embassy in Cyprus called the Associated Press's question a "provocation" and said it had "no idea who is Nataliya Veselnitskaya and what she sends or doesn't send to the Cypriot Officials."

Alexandrov, reached at the prosecutor general's office, refused to speak to AP.

Veselnitskaya appears to have gotten government support too.

When Swiss officials arrived in Moscow in September 2015 to interrogate Denis Katsyv, one of her key clients, they were met not just by Veselnitskaya but by Lt.-Col. A. V. Ranchenkov, a senior Interior Ministry official previously known for his role investigating the Russian punk band Pussy Riot.

Ranchenkov devoted a chunk of the interview to questions about the legality of Browder's actions, according to a transcript of the interrogation reviewed by Associated Press.

The Russian Interior Ministry did not return messages seeking comment.

The emails also show how Veselnitskaya tried to extend her influence to the United States, where she was working to overturn the Magnitsky Act, a sanctions law that was championed by Browder after his lawyer, Sergei Magnitsky, died under suspicious circumstances in a Russian prison.

Moscow responded to the sanctions with a ban on U.S. adoptions of Russian orphans. That prompted lobbyists to court groups such as Families for Russian and Ukrainian Adoption Including Neighbouring Countries, or FRUA, a charity that supports families who adopt children from former Soviet bloc nations. The idea was to use the issue of adoptions to help them reverse the sanctions.

'My antennae were out'

Jan Wondra, FRUA's chairman, said she attended a meeting in Washington on June 8, 2016, with a group of people that included Rinat Akhmetshin, a Russian-American lobbyist who was working with Veselnitskaya to overturn the sanctions.

The group told her there was evidence that the Magnitsky Act was propelled by bogus claims spread by Browder, Wondra said. It promised that the revelation could lead to the overturning of the Russian adoption ban.

Wondra told AP that she was suspicious and feared that the lobbyists wanted FRUA's endorsement for their own purposes.

"My antennae were out. I looked at this as an attempt to put public pressure on Congress to rescind all or a part of the Magnitsky Act," she said, emphasizing that she spoke only for herself, not her organization. "The conclusion I drew was that FRUA should not participate. And we didn't."

Russian-American lobbyist Rinat Akhmetshin, right, is one of the people who participated in the June 2016 meeting between Veselnitskaya, Donald Trump Jr. and other Trump campaign officials. (Alex Wong/Getty Images)

Akhmetshin, who would join Veselnitskaya at the Trump Tower meeting the next day, declined comment.

The emails obtained by AP leave some unanswered questions.

In particular, the Dossier Centre's investigation turned up almost no messages about the Trump Tower meeting itself. The group said it received only a few messages dealing with the media queries when the meeting became public in mid-2017.

There's no mention either of the Russian hack-and-leak operation that began rattling the Democrats immediately following Veselnitskaya's visit.


CBC | World News

Leaked Benchmarks Suggest Intel Will Drop Hyper-Threading from Core i7

This site may earn affiliate commissions from the links on this page. Terms of use.

For most of the past decade, Intel has followed a fairly steady set of rules when speccing out its Core i3, i5, and i7 processors. Until last year, Core i3 chips were dual-cores with Hyper-Threading enabled, Core i5 CPUs lacked Hyper-Threading (Intel’s implementation of simultaneous multi-threading), and Core i7 CPUs were quad-cores with HT enabled. Last year, Intel changed things up: the Core i3 family bumped to quad cores (no HT), the Core i5 jumped to six cores (also no HT), and the Core i7 retained six cores with Hyper-Threading. Now, if leaked benchmarks are accurate, HT is going away from the i7 as well.

The data comes from a set of Core i7-9700K results that popped up in SiSoft Sandra and shows the chip as having eight CPU cores and eight threads, with no Hyper-Threading in sight. The peak turbo frequency would be 4.9GHz, and Ars Technica reports that a Core i9-9900K would launch at the same time, with a full eight cores and 16 threads.


Original story by WCCFTech

Let’s assume that Intel’s Hyper-Threading is worth, on average, about 20 percent additional performance. On a six-core CPU, you’d expect HT to add roughly the equivalent of 1.2 “real” CPU cores. At first glance, then, the 9700K doesn’t seem like much of an improvement over the Core i7-8700K — an eight-core chip without HT and a 4.9GHz peak turbo may only be slightly faster than the 8700K. Stripping HT may explain the smaller L3 cache relative to core counts — the Core i5 chips that lack HT offer six cores and 9MB of L3, while the Core i7-8700K offers 12MB. The new 9700K supposedly uses 12MB as well, despite adding two additional cores.
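That back-of-the-envelope model is easy to sketch. The 20 percent SMT uplift here is an assumption for illustration, not a measured figure — real workloads vary widely:

```python
# "Real core equivalent" under an assumed 20 percent Hyper-Threading uplift.
SMT_UPLIFT = 0.20  # assumed average benefit of HT; purely illustrative

def effective_cores(physical_cores: int, ht_enabled: bool) -> float:
    """Physical cores scaled by the assumed SMT benefit."""
    return physical_cores * (1 + SMT_UPLIFT) if ht_enabled else float(physical_cores)

i7_8700k = effective_cores(6, ht_enabled=True)   # 6C/12T (shipping)
i7_9700k = effective_cores(8, ht_enabled=False)  # 8C/8T (rumored)
i9_9900k = effective_cores(8, ht_enabled=True)   # 8C/16T (rumored)
print(i7_8700k, i7_9700k, i9_9900k)  # 7.2 8.0 9.6
```

On this crude model, the 9700K is only about 11 percent “bigger” than the 8700K — which is why any clock speed kick matters so much to its positioning.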

Assuming, as always, that these rumors are true, I think we can intuit a bit about what Intel’s strategy with this move would be. It’s not clear how much faster these new Core i7s would be than the 8700K, but let’s assume Intel can eke out, say, 10 percent more performance from the eight physical cores and a slight clock speed kick. That should put the 9700K on par with the 2700X again, if not ahead of it, and priced at roughly $350.

Now, Intel rolls up with the Core i9-9900K — an eight-core / 16-thread HT-enabled CPU that’s a further step faster than the Core i7-9700K. It’ll beat the 2700X in single-threaded and multi-threaded code, having matched it core-for-core. It’ll be priced to match, at $450.

AMD and Intel Are Playing CPU Chicken

At this point, both AMD and Intel are basically daring the other to do something drastic to their high-end desktop CPU product lines. Last year, Intel blinked first with the short-lived Kaby Lake-X platform that had no reason to exist and vanished into the ether in short order. AMD’s Threadripper has pounded on Intel’s HEDT pricing by offering far more cores for less cash; Ryzen’s best relative performance against Intel is the $1,000 Threadripper 1950X against the 10-core Core i9-7900X. With a 32-core Threadripper coming soon, AMD has dared Intel to trim its per-core pricing or be buried under an onslaught of dramatically higher core count chips that offer much better performance.

But by positioning the Core i9-9900K at the top of its own stack in an eight-core / 16-thread configuration, Intel is basically daring AMD to try to bring Ryzen CPUs with higher core counts to the desktop as well. It’s not clear if such chips would fit into AMD’s existing socket infrastructure or not (Threadripper motherboards are typically more expensive than your standard AM4 products). There’s unquestionably room in AMD’s product line for a higher core count chip — the eight-core 1900X is $329, while the 12-core Threadripper 1920X is $785. But taking on Intel’s eight-core means AMD either needs a 10-core chip that can drop into that $450 price range or it needs a hell of a price cut on a 12-core chip.
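To put numbers on that gap, here’s the per-core math using the prices quoted above. The 10-core part is purely hypothetical — it’s just the $450 slot discussed in the text:

```python
# Per-core pricing implied by the figures in the text above.
parts = {
    "Threadripper 1900X": (329, 8),
    "Threadripper 1920X": (785, 12),
    "hypothetical 10-core": (450, 10),  # the rumored Intel price point
}
per_core = {name: usd / cores for name, (usd, cores) in parts.items()}
for name, price in per_core.items():
    print(f"{name}: ${price:.2f} per core")
```

A 10-core chip at $450 works out to $45 per core — right between AMD’s two existing Threadripper data points, which is what makes the slot so awkward for AMD to fill.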

And this raises another question: Which company currently has more long-term headroom? Unofficial rumors suggest Zen 2 will target a 10-15 percent IPC uplift, but AMD is still trying to close the gap with Intel overall, not surpass it. With Intel stuck on 14nm, the momentum advantage is very much on AMD’s side of the equation, but given the fundamental problems with improving silicon performance, it isn’t clear how much daylight the two companies can create between each other in the final analysis. We’re going to need to see someone jockeying with Intel for the pole position before we can tell to what extent Intel’s silicon scaling problems are unique to Intel or common to the entire industry. Right now, we think those problems tend to be common to the industry and a function of clock speeds and material properties. Based on how the next 18 months play out, that could change.

Now read: PCMag’s Best Gaming CPUs in 2018


ExtremeTech

New AMD GPU Rumors Suggest Polaris Refresh in Q4 2018

This site may earn affiliate commissions from the links on this page. Terms of use.

For the past six months, the consumer GPU market has grappled with three basic questions: When would GPUs be affordable, when would Nvidia launch a refresh to its popular Pascal architecture, and when would AMD launch new parts to better compete with Nvidia? So far this year, we’ve gotten lower GPU prices, an Nvidia launch event that might or might not happen this fall (rumors on timing and plans and possible inventory hold-over are running rampant), and from AMD’s corner… not much at all. Publicly, the only thing the company has said is that it plans to launch a 7nm Vega refresh for the machine learning market later this year.

But there’s a rumor from Chiphell user WJM47196, who has a fairly good track record on these sorts of announcements, making the rounds about what AMD might be planning for this year — and it’ll probably raise some eyebrows.

First off, there’s a new Polaris family supposedly being prepped for a Q4 2018 introduction. Built on the same 12nm process improvement as AMD’s second-generation Ryzen CPUs, it would offer a 15 percent performance improvement over current cards.

There are only two ways for AMD to deliver this kind of performance improvement, and only one of them makes sense. To improve GPU performance while keeping the architecture the same, you can make the GPU wider, or you can increase its clock speed. With architectural changes unlikely, this suggests that AMD would instead simply try to ratchet clocks higher. And who knows? Given that Polaris was the first 14nm discrete GPU GlobalFoundries had ever built, it’s possible that AMD found some low-hanging fruit that would allow it to hit higher clocks. The larger problem is that while a 15 percent jump would give AMD much stronger footing against Nvidia today, it might not compare well against any next-generation hardware Nvidia launches — at least not without price cuts.
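If the entire 15 percent came from clocks, the implied frequency is simple arithmetic — assuming roughly linear scaling with clock speed, which GPUs only approximate. The 1340MHz baseline used here is the reference RX 580 boost clock, not a figure from the rumor itself:

```python
# Clock speed implied by a purely clock-driven 15 percent uplift,
# assuming linear performance scaling (an approximation at best).
UPLIFT = 0.15
rx580_boost_mhz = 1340  # reference RX 580 boost clock

required_mhz = rx580_boost_mhz * (1 + UPLIFT)
print(round(required_mhz))  # 1541
```

A ~1540MHz boost clock would be ambitious for GCN on 12nm, which is part of why the rumor raises eyebrows.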

Next up, Navi. There’s a rather confused suggestion that Navi will be both a mainstream and high-end part arriving sometime in the 2019 timeframe, and that it will debut in the budget segment first before eventually launching as a high-end, HBM2 equipped part sometime “much later.” The suggested time frame is:

Q4 2018: Polaris 30 (performance up 15 percent)
H1 2019: Navi 10 (budget part; timing on this introduction is unclear, with additional reference to a Q1 release)
H2 2020: A new, high-end Navi part, as a “true” successor to Vega

This is rather muddled, and I’m not sure at all how much weight I’d put on it. AMD has been widely reported to be working on Navi for Sony’s PlayStation 5 and we know the company has often aligned its console and non-console launches. The idea that the work AMD is doing for Sony could lead to a budget GPU built on an early 7nm process as a pipe-cleaner? Not crazy. Similarly, the idea that AMD would port Polaris to 12nm and spin it for a bit of additional headroom isn’t nuts, either.


AMD’s larger Polaris GPU (pictured above)

If we had to make a bet, here’s where we’d land. The Navi 10 chip, if it exists, is a custom 7nm GPU built for a customer like Apple. This chip gives AMD a better foothold in ultra-low-power markets, where its relatively power-hungry GCN architecture can use all the process node improvements it can get. The overhauled Polaris lineup, meanwhile, gives AMD a 15 percent performance kick on the desktop, where it can be combined with price cuts to boost AMD’s overall position. This would be further strengthened if Nvidia focuses on refreshing its top-end GPUs first, as the company is likely to do. It wouldn’t be unusual for the introduction of new midrange cards to lag the launch of a GTX 1180/1170 by some months.

The idea that Navi is a single GPU design that stretches across both GDDR6 and HBM2 is a little odd, and likely reflects the fact that Navi is being used to refer to a new architecture rather than a single card. There’s precious little reason why AMD would attempt to take a single GPU across both memory standards when the costs of doing so are so high — the entire GPU memory subsystem and memory controller both have to be rearchitected when moving from GDDR6 to HBM2 or vice versa. This may not be the only reason why Nvidia has kept its HBM2-equipped GPUs almost entirely in the professional space, but it’s undoubtedly part of it. But the idea that it could take AMD until Q3/Q4 2020 to replace Vega had better be wrong.

It took AMD ~26 months to move from Fury X to Vega. At the time, this was the longest high-end refresh cycle in history, though Nvidia has since taken that record: if the GTX 1180 launches in August, it will have taken Nvidia 27 months to pull off a high-end refresh. But Vega is already almost a year old. It didn’t cleanly put AMD back in competition with Nvidia at the highest level, it isn’t going to stand up well against the GTX 1180 or a hypothetical 1180 Ti without price cuts to reposition it (not unless Nvidia simply raises its own prices, which, hey, it could do), and the idea of leaning on it as a high-end solution for another two years after an Nvidia refresh is a poor one. It would effectively imply that AMD, which is already operating a GPU cycle behind Nvidia, would be willing to sit out the high end entirely for another two-year period.
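Those refresh-cycle figures check out with simple date arithmetic. Launch months: Fury X in June 2015, RX Vega in August 2017, GTX 1080 in May 2016; the August 2018 GTX 1180 date is, as the text notes, hypothetical:

```python
from datetime import date

def months_between(a: date, b: date) -> int:
    """Whole months between two dates, ignoring the day of the month."""
    return (b.year - a.year) * 12 + (b.month - a.month)

fury_to_vega = months_between(date(2015, 6, 1), date(2017, 8, 1))
pascal_to_1180 = months_between(date(2016, 5, 1), date(2018, 8, 1))
print(fury_to_vega, pascal_to_1180)  # 26 27
```

A Vega replacement in H2 2020 would stretch AMD’s high-end cycle past 36 months — well beyond either of those records.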

But as we’ve said, this is a rumor — and a rather disjointed rumor at that. Take it with a small mountain of the white stuff.

Now read: PCMag’s Best Graphics Cards of 2018


ExtremeTech