
ChatGPT AI Tool— We all work for robots now



1 hour ago, Goredho said:

The rest of the piece argues for continued export controls on chips to China, on the basis that if future AI unlocks "extremely rapid advances in science and technology" the US needs to get there first, due to his concerns about "military applications of the technology".

Ya know, applying brutal tariffs against Taiwan and TSMC - the ONLY people in the world who can make these chips - is really gonna help the strategic availability of those chips

Edited by Captainant

https://x.com/wallstengine/status/1885253685090930780

Spoiler

SemiAnalysis published an analysis on DeepSeek, addressing recent claims about its cost and performance. $NVDA

The report states that the widely circulated $6M training cost for DeepSeek V3 is incorrect, as it only accounts for GPU pre-training expenses and excludes R&D, infrastructure, and other critical costs. According to their findings, DeepSeek’s total server CapEx is around $1.3B, with a significant portion allocated to maintaining and operating its GPU clusters.

The report also states that DeepSeek has access to roughly 50,000 Hopper GPUs, but clarifies that this does not mean 50,000 H100s, as some have suggested. Instead, it’s a mix of H800s, H100s, and the China-specific H20s, which NVIDIA has been producing in response to U.S. export restrictions. SemiAnalysis points out that DeepSeek operates its own datacenters and has a more streamlined structure compared to larger AI labs.

On performance, the report notes that R1 matches OpenAI’s o1 in reasoning tasks but is not the clear leader across all metrics. It also highlights that while DeepSeek has gained attention for its pricing and efficiency, Google’s Gemini Flash 2.0 is similarly capable and even cheaper when accessed through the API.

A key innovation cited is Multi-Head Latent Attention (MLA), which significantly reduces inference costs by cutting KV cache usage by 93.3%. The report suggests that any improvements DeepSeek makes will likely be adopted by Western AI labs almost immediately. SemiAnalysis also mentions that costs could fall another 5x by the end of the year, and that DeepSeek’s structure allows it to move quickly compared to larger, more bureaucratic AI labs. However, it notes that scaling up in the face of tightening U.S. export controls remains a challenge.
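The KV-cache reduction figure is easiest to see with back-of-envelope arithmetic. Below is a minimal sketch comparing per-token cache size for standard multi-head attention against an MLA-style compressed latent; the layer count, head count, and latent width are illustrative assumptions loosely in the shape of a DeepSeek-V3-class model, not numbers taken from the SemiAnalysis report.

```python
# Back-of-envelope KV-cache comparison: standard multi-head attention vs. an
# MLA-style compressed latent. All config values are assumptions for illustration.

BYTES_PER_ELEM = 2      # fp16/bf16 cache
N_LAYERS = 61           # assumed transformer depth
N_HEADS = 128           # assumed attention heads
HEAD_DIM = 128          # assumed per-head dimension
KV_LATENT_DIM = 512     # assumed MLA compressed KV latent width
ROPE_KEY_DIM = 64       # assumed decoupled positional key stored alongside the latent

def standard_kv_bytes_per_token() -> int:
    # Standard MHA caches a full key and a full value vector per head, per layer.
    return N_LAYERS * N_HEADS * HEAD_DIM * 2 * BYTES_PER_ELEM

def mla_kv_bytes_per_token() -> int:
    # MLA caches one shared low-rank latent (plus a small positional key) per layer
    # and reconstructs per-head keys/values from it at attention time.
    return N_LAYERS * (KV_LATENT_DIM + ROPE_KEY_DIM) * BYTES_PER_ELEM

base = standard_kv_bytes_per_token()
mla = mla_kv_bytes_per_token()
print(f"standard KV cache: {base / 1024:.0f} KiB per token")
print(f"MLA KV cache:      {mla / 1024:.0f} KiB per token")
print(f"reduction:         {1 - mla / base:.1%}")
```

The exact percentage depends on the baseline you assume (full multi-head attention vs. grouped-query attention, cache precision, and so on), so this sketch's output won't match the report's 93.3% exactly; the point is the order of magnitude.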

 


State of Texas bans it from government devices, along with some Chinese social media stuff.  Private citizens can still access it, of course.  There are rumblings elsewhere that they are going to try to block it for everybody, but also that there is resistance and there are Constitutional issues.

https://m10news.com/texas-becomes-first-state-to-ban-chinese-ai-app-deepseek-and-social-media-platform-rednote/


1 hour ago, atomheartbevo said:

State of Texas bans it from government devices, along with some Chinese social media stuff.  Private citizens can still access it, of course.  There are rumblings elsewhere that they are going to try to block it for everybody, but also that there is resistance and there are Constitutional issues.

https://m10news.com/texas-becomes-first-state-to-ban-chinese-ai-app-deepseek-and-social-media-platform-rednote/

I mean, you can also just download it and run it locally. There is no practical way to block it. 
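For anyone curious what "run it locally" actually looks like, here is a minimal sketch using Hugging Face transformers and one of the small distilled R1 checkpoints (the full-size model is far too big for consumer hardware, which is why the distills exist). The model ID, dtype, and generation settings are assumptions picked for illustration, not a recommendation.

```python
# Minimal local-inference sketch. Assumes `pip install transformers accelerate torch`
# and that the (small, distilled) checkpoint named below is the one you want.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"  # assumed small distill that fits on a laptop GPU/CPU

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

messages = [{"role": "user", "content": "Explain KV caching in one paragraph."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Wrappers like Ollama or llama.cpp reduce the same idea to a one-line command, assuming they carry a quantized build of the weights, which is exactly why a ban is impractical.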


29 minutes ago, Dahobbs said:

I mean, you can also just download it and run it locally. There is no practical way to block it. 

There is also noise about banning it nationally somehow, and that’s the fucking bat signal for /r/DataHoarders to get to work and download every version they can find and post it around the internet. 


19 hours ago, atomheartbevo said:

There is also noise about banning it nationally somehow, and that’s the fucking bat signal for /r/DataHoarders to get to work and download every version they can find and post it around the internet. 

There are genuine free speech issues around the idea of banning a specific configuration of weights and biases in a transformer architecture.

So I'm sure it'll go through without a hitch, knowing our current timeline
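To make the "configuration of weights and biases" point concrete: an open-weights release is just a pile of files containing named arrays of numbers. Here is a minimal sketch, assuming you have a downloaded safetensors shard on disk (the filename below is hypothetical), of what such a ban would actually be prohibiting.

```python
# An open-weights checkpoint is, mechanically, files of named float arrays.
# The shard filename is hypothetical; substitute whatever you downloaded.
from safetensors import safe_open

SHARD = "model-00001-of-00002.safetensors"  # hypothetical local shard

with safe_open(SHARD, framework="pt") as f:
    total = 0
    for name in f.keys():
        tensor = f.get_tensor(name)
        total += tensor.numel()
        print(f"{name}: shape={tuple(tensor.shape)}, dtype={tensor.dtype}")
    print(f"parameters in this shard: {total:,}")
```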

