How to talk about A.I. like an insider

by admin
May 21, 2023


See also: Parrots, paperclips, and safety vs ethics: Why the artificial intelligence debate sounds like a foreign language

Here’s a list of some terms used by AI insiders:

AGI — AGI stands for “artificial general intelligence.” As a concept, it refers to an AI significantly more advanced than anything currently possible, one that can do most things as well as or better than most humans, including improving itself.

Example: “For me, AGI is the equivalent of a median human that you could hire as a coworker, and they could say do anything you would be happy with a remote coworker doing behind a computer,” Sam Altman said at a recent Greylock VC event.

AI ethics describes the desire to prevent AI from causing immediate harm, and often focuses on questions like how AI systems collect and process data and the possibility of bias in areas like housing or employment.

AI safety describes the longer-term fear that AI will progress so suddenly that a super-intelligent AI might harm or even eliminate humanity.

Alignment is the practice of tweaking an AI model so that it produces the outputs its creators desire. In the short term, alignment refers to practical work such as building software and moderating content. But it can also refer to the much larger and still theoretical task of ensuring that any AGI would be friendly toward humanity.

Example: “What these systems get aligned to — whose values, what those bounds are — that is somehow set by society as a whole, by governments. And so creating that dataset, our alignment dataset, it could be, an AI constitution, whatever it is, that has got to come very broadly from society,” Sam Altman said last week during the Senate hearing.

Emergent behavior — Emergent behavior is the technical way of saying that some AI models show abilities that weren’t initially intended. It can also describe surprising results from AI tools being deployed widely to the public.

Example: “Even as a first step, however, GPT-4 challenges a considerable number of widely held assumptions about machine intelligence, and exhibits emergent behaviors and capabilities whose sources and mechanisms are, at this moment, hard to discern precisely,” Microsoft researchers wrote in Sparks of Artificial General Intelligence.

Fast takeoff or hard takeoff — A phrase suggesting that if someone succeeds at building an AGI, it will already be too late to save humanity.

Example: “AGI could happen soon or far in the future; the takeoff speed from the initial AGI to more powerful successor systems could be slow or fast,” said OpenAI CEO Sam Altman in a blog post.

Foom — Another way to say “hard takeoff.” It’s an onomatopoeia, and has also been described as an acronym for “Fast Onset of Overwhelming Mastery” in several blog posts and essays.

Example: “It’s like you believe in the ridiculous hard take-off ‘foom’ scenario, which makes it sound like you have zero understanding of how everything works,” tweeted Meta AI chief Yann LeCun.

GPU — The chips used to train models and run inference, which are descendants of chips used to play advanced computer games. The most commonly used model at the moment is Nvidia’s A100.

Example: [embedded tweet from Stability AI founder Emad Mostaque]
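
To make the hardware side concrete, here is a minimal sketch of how a training or inference script typically checks for a GPU and runs work on it. PyTorch is an assumption here; the article does not name a specific framework.

    # Minimal sketch: detect a CUDA-capable GPU and run a small computation on it.
    # Assumes PyTorch; the article does not prescribe any particular framework.
    import torch

    device = "cuda" if torch.cuda.is_available() else "cpu"
    print("Using device:", device)
    if device == "cuda":
        print("GPU:", torch.cuda.get_device_name(0))

    # A toy matrix multiply, the kind of operation GPUs accelerate
    # during both training and inference.
    a = torch.randn(1024, 1024, device=device)
    b = torch.randn(1024, 1024, device=device)
    print((a @ b).shape)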

Guardrails are software and policies that big tech companies are currently building around AI models to ensure that they don’t leak data or produce disturbing content, which is often called “going off the rails.” It can also refer to specific applications that protect the AI from going off topic, like Nvidia’s “NeMo Guardrails” product.

Example: “The moment for government to play a role has not passed us by. This period of focused public attention on AI is precisely the time to define and build the right guardrails to protect people and their interests,” Christina Montgomery, the chair of IBM’s AI ethics board and VP at the company, said in Congress this week.
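
As a purely illustrative toy, here is one very simple form a guardrail can take: a wrapper that screens a model’s reply against a blocklist before returning it. This is not Nvidia’s NeMo Guardrails or any production system; the terms and function names below are invented for illustration.

    # Toy output-side guardrail: screen a model's reply against a blocklist
    # before showing it to the user. Everything here is invented for illustration.
    BLOCKED_TERMS = {"credit card number", "social security number"}

    def apply_guardrail(model_reply: str) -> str:
        lowered = model_reply.lower()
        if any(term in lowered for term in BLOCKED_TERMS):
            return "Sorry, I can't share that."
        return model_reply

    print(apply_guardrail("Here is a recipe for banana bread."))
    print(apply_guardrail("Sure, my social security number is..."))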

Inference — The act of using an AI model to make predictions or generate text, images, or other content. Inference can require a lot of computing power.

Example: “The problem with inference is if the workload spikes very rapidly, which is what happened to ChatGPT. It went to like a million users in five days. There is no way your GPU capacity can keep up with that,” Sid Sheth, founder of D-Matrix, previously told CNBC.
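
For a rough sense of what an inference call looks like in code, here is a minimal sketch using the open-source Hugging Face transformers library and the small GPT-2 model; both are assumptions chosen for illustration, not something the article references.

    # Minimal inference sketch: load a small open-source model and
    # generate text from a prompt. Assumes the `transformers` library
    # and the GPT-2 checkpoint, chosen here purely for illustration.
    from transformers import pipeline

    generator = pipeline("text-generation", model="gpt2")
    result = generator("AI insiders like to say that", max_new_tokens=20)
    print(result[0]["generated_text"])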

Large language model — A kind of AI model that underpins ChatGPT and Google’s new generative AI features. Its defining feature is that it uses terabytes of data to find the statistical relationships between words, which is how it produces text that seems like a human wrote it.

Example: “Google’s new large language model, which the company announced last week, uses almost five times as much training data as its predecessor from 2022, allowing it to perform more advanced coding, math and creative writing tasks,” CNBC reported earlier this week.
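
The “statistical relationships between words” idea can be shown with a deliberately tiny toy: count which word follows which in some text, then predict the most likely next word. Real large language models learn far richer statistics with neural networks trained on terabytes of text, but the sketch below, built on an invented mini-corpus, captures the gist.

    # Toy next-word predictor: count word pairs in a tiny corpus and
    # pick the most frequent follower. Real LLMs learn vastly richer
    # statistics with neural networks trained on terabytes of text.
    from collections import Counter, defaultdict

    corpus = "the cat sat on the mat and the cat slept on the mat".split()

    follower_counts = defaultdict(Counter)
    for current_word, next_word in zip(corpus, corpus[1:]):
        follower_counts[current_word][next_word] += 1

    def predict_next(word: str) -> str:
        counts = follower_counts.get(word)
        return counts.most_common(1)[0][0] if counts else "<unknown>"

    print(predict_next("the"))  # e.g. "cat"
    print(predict_next("cat"))  # e.g. "sat"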

Paperclips are an important symbol for AI safety proponents because they represent the chance that an AGI could destroy humanity. The idea comes from a thought experiment published by philosopher Nick Bostrom about a “superintelligence” given the mission to make as many paperclips as possible; it decides to turn all humans, the Earth, and increasing parts of the cosmos into paperclips. OpenAI’s logo is a reference to this tale.

Example: “It also seems perfectly possible to have a superintelligence whose sole goal is something completely arbitrary, such as to manufacture as many paperclips as possible, and who would resist with all its might any attempt to alter this goal,” Bostrom wrote in his thought experiment.

Singularity is an older term that’s not used often anymore, but it refers to the moment that technological change becomes self-reinforcing, or the moment of creation of an AGI. It’s a metaphor — literally, singularity refers to the point of a black hole with infinite density.

Example: “The advent of artificial general intelligence is called a singularity because it is so hard to predict what will happen after that,” Tesla CEO Elon Musk said in an interview with CNBC this week.


