
Nvidia Is Now Worth More Than Meta (pcgamer.com)

Nvidia is now a larger company than social media giant Meta. PC Gamer reports: In a meteoric turn of events, Nvidia has surged to become the 7th largest company in the US, despite being nowhere close only a few years ago, and helped along by Meta's recent share price collapse. Meta's fall from stock market grace this past week saw 30% of its share value wiped out, leaving it with a total value in shares, or market cap, of just $615.70B (at time of writing). That's clearly still a lot of money, but it's notably less money than it was worth at the beginning of last week -- around $260B less.

Compare that to Nvidia's market cap of $657.06B, and the green team comes out on top. Perhaps not for long, but we'll see. That's still a little shy of Berkshire Hathaway in 6th place at over $720B, but it's markedly higher than Nvidia was only a few years ago, when its share value was a small fraction of what it is today. [...] Nvidia officially terminated its attempt to buy Arm, the UK-based chip designer, for $40B, and that did see some value wiped off its share price in the following days. Though clearly that dark cloud hasn't stuck around Nvidia's Santa Clara HQ, as its share price is now back up to around $260. That's over 40% up on its lowest point this year, and just under 22% down on its all-time high of $334.
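
Those percentages are easy to sanity-check. A minimal sketch of the arithmetic in Python, using only the approximate share prices quoted above (not verified figures):

    # Sanity-check the percentages quoted above; inputs are the article's
    # approximate share prices, not verified figures.
    ath = 334.0   # all-time high
    now = 260.0   # price at time of writing
    print(f"Below ATH: {(1 - now / ath) * 100:.1f}%")       # ~22.2% ("just under 22%")
    implied_low = now / 1.40   # "over 40% up on its lowest point"
    print(f"Implied low for the year: ${implied_low:.2f}")  # ~$185.71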



  • And yet, despite all that money, they are incapable of writing drivers that don't suck.

    • Well I have 2 Windows machines with NVIDIA cards that have never had a problem with drivers. I do however also have a Linux box and I have had issues with those drivers. So I can say Linux drivers suck I guess.

      • Meanwhile, my gaming laptop with an RTX 2060 runs both Windows and Linux, and I never have any problems with either.

        No Wayland support, but that doesn't really bother me, as X is still well-supported for now.
        • No Wayland support, but that doesn't really bother me, as X is still well-supported for now.

          It's taken 13 years to fail to displace a system that Wayland's developers declared to be really rather bad.

          The trouble is that it's just not a very well thought out system. It pushes almost everything onto the compositors and toolkits, so the devs can tell us that's "out of scope" and that what's in scope works. But the result feels somehow much more balkanised than under X. You can have a compositor that does remoting, for example.

          • What it does makes a lot of sense.
            It performs better for a reason. That's all it was really designed to do.

            Now, I'm one of the rare people who do use X forwarding every single day, so I'm not eager for the transition to Wayland, but it's happening. It's the default display server on every distribution I can think of now.
            You can still make X forwarding work via XWayland and funky environment vars to tell the TKs to render to the X server/bridge instead of their local window, but it's a little janky.

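            (For the curious, a minimal sketch of those "funky environment vars", here in Python; "someapp" is a placeholder program, and this assumes GTK/Qt toolkits with XWayland or a forwarded DISPLAY available:)

                import os, subprocess

                # Force GTK/Qt to use the X11 backend (XWayland, or a forwarded
                # DISPLAY over ssh -X) instead of a native Wayland surface.
                env = dict(os.environ)
                env["GDK_BACKEND"] = "x11"       # GTK 3/4
                env["QT_QPA_PLATFORM"] = "xcb"   # Qt 5/6
                subprocess.run(["someapp"], env=env)  # placeholder program
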
            • It performs better for a reason. That's all it was really designed to do.

              Does it in practice though? There are definitely a few cases where it undoubtedly performs better but in an irrelevant way: it certainly cuts down on the number of context switches to process a keypress compared to X with a compositor. That maybe halves the latency, but that's taking it from microseconds to microseconds, so it's not something that really bothers me.

              For straight-line graphics performance, X has some pretty decent paths.

              • Does it in practice though? There are definitely a few cases where it undoubtedly performs better but in an irrelevant way: it certainly cuts down on the number of context switches to process a keypress compared to X with a compositor. That maybe halves the latency, but that's taking it from microseconds to microseconds, so it's not something that really bothers me.

                Not just keypresses- all input, and associated actions, like moving a window.
                Also, the direct rendering pathway is much leaner. You're talking directly to the compositor instead of to the compositor via the X server. Generally, in benchmarks, there's around a +10% uptick, which, honestly, who cares; really it's about the 10% less work done under normal circumstances, meaning better power utilization for the increasing number of things that use the GPU for mundane tasks.

                I haven't seen a very coherent argument to that effect. I've seen some pretty disingenuous ones from people who I know know better.

                It took 12 years to get middle-click paste ironed out, and forwarding is still janky 13 years later! On the balance of evidence I'd say both have been tried, and while TK development sounds better in theory, in practice it has not proven to be.

                X struck what turned out to be a very good balance (there was no requirement that libinput or mode setting ever be in the server, and on old Unix workstations that often wasn't the case with the equivalents). Moving those things out had no effect on the X protocol, or on the programs using X11; it was purely an implementation detail of the server.

                Alright, that's fair.

                • Not just keypresses- all input, and associated actions, like moving a window.

                  I'm skeptical of the claim. Once the X server or compositor has the window contents, it just redraws the surface elsewhere.

                  Also, the direct rendering pathway is much leaner.

                  X has direct rendering.

                  I'd also argue that the 12 year metric isn't quite fair. Wayland integration into live desktops didn't really begin in earnest until recently, and it's advancing very quickly.

                  They've supported wayland for ages, but they haven't supported wayland well for ages. We also heard a lot about how X sucks over and over and over. So far it's still taken 13 years and about the best users say with a wayland setup is it works about as well as X and there aren't too many bugs. I think the claims about X sucking just aren't borne out by the reality of the transition.

                  • I'm skeptical of the claim. Once the X server or compositor has the window contents, it just redraws the surface elsewhere.

                    Every movement of the mouse is a pass from the kernel, to the X server, to the compositor, to the client.

                    X has direct rendering.

                    Sure does. However, the communication with the DRM goes through the X server, and the X server has to handle contention between its clients (including the compositor).
                    We're on DRI3 now, which almost doesn't suck (we can finally allocate our own buffers!).
                    Of course, that comes with its own layer of cruft, specifically PRIME, which sucks butthole.

                    They've supported wayland for ages, but they haven't supported wayland well for ages. We also heard a lot about how X sucks over and over and over. So far it's still taken 13 years and about the best users say with a wayland setup is it works about as well as X and there aren't too many bugs. I think the claims about X sucking just aren't borne out by the reality of the transition.

                    OK, that's just nonsense. Who is this mystical "they"?
                    GNOME wasn't fully ported to Wayland until 2015.
                    KDE, work started last year.

                    • Every movement of the mouse is a pass from the kernel, to the X server, to the compositor, to the client.

                      Oh ok I misunderstood. But I still don't think that's significant.

                      OK, that's just nonsense. Who is this mystical "they"?
                      GNOME wasn't fully ported to Wayland until 2015.

                      Well quite. It took quite a long time to do the porting.

                      KDE, work started last year.

                      According to this URL:

                      https://community.kde.org/KWin... [kde.org]

                      people were giving presentations about it in 2014, which means work took 8 years.

                      No, absolutely not.

    • just has to be better than AMD's, which is a very low bar.
      • The bar is indeed very low but nVidia is still worlds away from clearing it. The AMD driver takes 269MB out of a 1.2GB kernel source with tens of thousands of other drivers, and doesn't work unless built as a module -- but at least I can't blame it for a single crash, for being unable to run an -rc or even a released kernel, for artifacts on the screen, etc.

  • Nvidia is next (Score:5, Insightful)

    by wakeboarder ( 2695839 ) on Wednesday February 09, 2022 @08:08PM (#62254509)

    I think it's sitting on a giant bubble; if crypto comes down, Nvidia is partially going down with it.

    • I think it's sitting on a giant bubble; if crypto comes down, Nvidia is partially going down with it.

      Nvidia hardware has uses besides crypto mining, you know. All a crypto-currency bubble bursting will do is bring video card prices back down to a sane level.

    • That's probably why they want to use their bubble money to buy stuff. Diversify as fast as possible.

    • Nvidia probably would or will take a loss if mining demand changes, etc. However, they have a valuable business model that is not bubble-related. I think from the point of view of Nvidia it is a great time to be a GPU manufacturer, and if they can print money now, why wouldn't they?

      In another way I might say Nvidia was always more valuable than Meta but the market has just caught up with the fact.
    • Shows you don't know NV's revenue streams very well. There ARE threats to their future, but that isn't a major one.

    • by Sloppy ( 14984 )

      Yeah, no one wants to see computers make pictures.

      • You can't freaking buy a card to make it make pictures, they are being sold to turn electricity into heat so people can win a lottery.

    • I don't know how much Nvidia gains from crypto buyers, but their hardware has lots of uses beyond that: their GPUs are heavily used in the AI field and to build supercomputers. Their chips are also used in cars and, of course, consumer GPUs.
    • And yet nVidia produces tangible products, vs Meta having an approximately equal valuation for what boils down to gathering advertising data... I have long felt like the latter was surely a bubble as well.
    • by fazig ( 2909523 )
      They have the digital art sector already cornered.
      There they offer a lot of proprietary solutions that artists (for video games, advertising, and many other fields in modern media) have become dependent on over the last decade.

      Their profits will shrink if crypto mining goes down, and I hope it crashes and burns in hell. They'll finally have to make decent graphics cards again, because people would no longer buy whatever crap they release for lack of choices. But they're nowhere near in trouble.
  • Come a long way (Score:5, Interesting)

    by jacks smirking reven ( 909048 ) on Wednesday February 09, 2022 @08:19PM (#62254533)

    I remember when NVidia was hot on the scene with the Riva TNT card, and as someone who was using a Voodoo 1 card, being a little jealous that you could have just one graphics card for both 2D and 3D. Just a few years later they were buying 3dfx, no less.

    Honestly, well deserved, especially compared to Facebook. NVidia are kind of dicks but they have consistently been the ones pushing a lot of the graphics and computation advances of the past decade. Would like them to have more competition though.

    • Guess you never got a Voodoo Banshee or Voodoo3?

      • I have a Voodoo, a Voodoo Rush, and a Voodoo 3.
        Never had a Banshee, but the Rush was the first combined card anyway, not the Banshee.
        Then a Voodoo5 5500. I was invested in 3dfx.
        Then I learned that the GeForce supported Hardware T&L, which made my Voodoo5 look like poopoo. It got wrecked in every benchmark. Unfortunately, 3dfx, though pioneers, were never able to keep up with NV.

        It's no surprise NV ended up purchasing their skeletal remains.
        • I had dual Voodoo Banshees with some hacked up support in Xfree86 to make it go. They made the Xterm scrolling go brrrr.
          Once the GeForce 256 came out, I bought it and never looked back.

          The PC VGA industry has always been a meat grinder. You fall behind in a product's release time, its performance, or fail to meet the price of your competitors, and you have wasted a lot of capital and essentially need to start over and invest in brand new technologies. Only companies that either never make mistakes or have a deep war chest survive.

        • 3dfx fucked themselves by deciding to make all of their chip customers their competition when they stopped being a chip designer / supplier and started retailing their own cards. Literally every one of their OEMs gave them the double-bird and selected Nvidia and S3 (remember them?) chips for their next generation and 3dfx revenue collapsed.

          Nvidia only started doing the same after there was basically no competition that wasn't already doing the same (AMD). Now it's just expected that you can buy the reference card straight from them.

        • Voodoo Rush was so bad it's almost worth not mentioning. I had one. It was disappointing to say the least.

          Also if you look at Voodoo Rush it was more of a two-card solution. They had a 2d card with a 3d daughtercard.

          • Voodoo Rush was so bad it's almost worth not mentioning. I had one. It was disappointing to say the least.

            It wasn't bad at the time. It was 2D + 3D, which was really cool.

            Also if you look at Voodoo Rush it was more of a two-card solution. They had a 2d card with a 3d daughtercard.

            Mine sure as hell didn't, though the "3dfx" chips and the 2D chips were on different halves of the card. If memory serves, you could get them with one of several different 2D chipset vendors, so I wouldn't be surprised if some kind of stacked card did exist.

    • NVidia are kind of dicks but they have consistently been the ones pushing a lot of the graphics and computation advances of the past decade.

      summed up quite nicely.
      also, their products have been "the good shit" basically since inception.

      looks like AMD will be the first to launch MCM GPUs, and those should be very strong performers...
      but given nvidia's history, I'm more excited about nvidia's theoretical response than AMD's actual product.

  • by hdyoung ( 5182939 ) on Wednesday February 09, 2022 @08:19PM (#62254537)
    Their true product is ad sales and user data. In other words, a slightly modernized version of Mad Men. And there are always people waiting in the wings to fleece that particular sheep. Their services are nothing more than honeypots to harvest user data and sell the ads. And social networking websites and video streaming sites simply aren't that difficult or unique to set up. In other words, Facebook is eminently replaceable.

    Nvidia, on the other hand, makes hard-to-design and hard-to-manufacture GPUs. If they disappeared tomorrow, other manufacturers could step up but it wouldn’t be easy or fast.

    Nvidia should be worth WAY more than Facebook, but markets do extremely non-rational things.
    • by registrations_suck ( 1075251 ) on Wednesday February 09, 2022 @08:48PM (#62254589)

      You could summarize your wise post this way:

      Nvidia does something genuinely useful. Facebook does not.

      • Exactly this. Nvidia actually makes a product that can do useful things. Facebook just collects product (your data) and sells it to literally anyone that can agree on a price. They are digital middle-men.

    • by istartedi ( 132515 ) on Wednesday February 09, 2022 @10:51PM (#62254757) Journal

      Replacing Nvidia requires an end to the chip fabrication bottleneck/supply issue AND a company that can design better and/or cheaper cards. Agreed that it's harder than coming up with the next BaceFook; but they've got a nice target on their back now. US and Europe are rushing to build chip fabs because they finally woke up and realized that outsourcing it all to Asia might not have been such a bright idea. In 5 years there could be a turn-around. We could actually have a glut of fabs, falling chip prices, and a fantastic opportunity for the other players to step up and eat Nvidia's lunch.

      • but they've got a nice target on their back now. US and Europe are rushing to build chip fabs because they finally woke up and realized that outsourcing it all to Asia might not have been such a bright idea.

        Hardly. The chip shortage being addressed in the EU and USA has nothing to do with the cutting-edge fabrication NVIDIA and the rest of the PC chip industry rely on. Only Intel's fabs now being built will compete with them; all other investments in this space are at a completely different node size.

      • by ceoyoyo ( 59147 )

        Nvidia is a fabless chip designer. More fabs is *good* for them. You don't think Ford is screwed as soon as people can buy STM32s again, do you?

        Nvidia's really good at designing GPUs, and they only have one real competitor. But I think their real strength is the work they've put into CUDA. I'd like OpenCL to succeed, but it just doesn't seem to have the reach CUDA does.
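
        As one illustration of that reach (my sketch, not the parent's claim): CuPy, a CUDA-backed, NumPy-compatible library, moves an array workload onto an Nvidia GPU with little more than an import swap. Assumes cupy is installed and a CUDA-capable GPU is present.

            import numpy as np
            import cupy as cp  # CUDA-backed, NumPy-compatible arrays

            x_cpu = np.random.rand(1_000_000)
            x_gpu = cp.asarray(x_cpu)      # host -> device copy
            y_gpu = cp.sqrt(x_gpu) * 2.0   # runs as CUDA kernels on the GPU
            y_cpu = cp.asnumpy(y_gpu)      # device -> host copy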

    • Markets do non-rational things?
      FB EPS: 3.67; NVIDIA EPS: 1.17.
      FB free cash flow: 4.37; NVIDIA free cash flow: 1.01.
      FB P/E: 16; NVIDIA P/E: 73 (price-to-earnings: lower is "better", *all things being equal*).
      From a financial standpoint FB just makes more money and profit than NVIDIA. One could ask why NVIDIA, a company that produces less profit and less free cash flow, is valued higher than FB.
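
      A quick sketch of that arithmetic (P/E = share price / annual earnings per share; the $260 price is from the article and the 73 P/E from the figures above, so treat the output as a rough implied number, not a verified one):

          # P/E = share price / annual earnings per share.
          nvidia_price = 260.0   # approximate share price from the article
          nvidia_pe = 73.0       # P/E quoted above
          implied_annual_eps = nvidia_price / nvidia_pe
          print(f"Implied NVIDIA annual EPS: ${implied_annual_eps:.2f}")  # ~$3.56
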
  • Wait. (Score:5, Funny)

    by NoNonAlphaCharsHere ( 2201864 ) on Wednesday February 09, 2022 @08:54PM (#62254601)
    So you're saying that a company that makes stuff (well) is more valuable than a company that (to put it kindly) doesn't do anything of any positive, recognizable value? That's un-American!
  • And it should be (Score:5, Insightful)

    by backslashdot ( 95548 ) on Wednesday February 09, 2022 @09:28PM (#62254655)

    nVidia tech is actual technology that requires our brightest brains to develop and create, whereas Meta is like anyone with a GED can come up with ideas for it, like hey dudes let us make legless avatars and call it metaverse. CPU design on the other hand requires you to know actual academic level shit. You have to do actual STEM. At least for two or three years until the EDA software can entirely design the cpu and put together IP blocks according to the fab rules. Then it will be analog, semiconductor materials, and device physics that will need actual STEM skills.

    • While Facebook itself is a bucket of shit dreamt up by someone who stole someone else's bucket-of-shit ideas, dismissing all of Meta over the stupidity of the metaverse is just stupid.

      They do pour huge amounts of R&D into datacentre design and network design, they laid some serious foundation work for big database design, and they created and maintain React (a framework used by very many applications). They are also providing the largest R&D investment in VR in general, not just the fucking stupid metaverse concept.

    • There are a lot of wild projects going on at Meta. Both Google and Facebook have long spent money on some pretty crazy stuff that is outside of their primary business. I think subconsciously they knew their business model is shit and are looking for "the next big thing" to pivot into if the grift is ever revealed.

      Other companies like Nvidia are more focused on their hardware and software platforms. Pretty much all public info on what NV does in their R&D seems to be somehow related to GPUs: gaming, compute, and so on.

    • nVidia tech is actual technology that requires our brightest brains to develop and create, whereas Meta is like anyone with a GED can come up with ideas for it, like hey dudes let us make legless avatars and call it metaverse. CPU design on the other hand requires you to know actual academic level shit. You have to do actual STEM. At least for two or three years until the EDA software can entirely design the cpu and put together IP blocks according to the fab rules. Then it will be analog, semiconductor materials, and device physics that will need actual STEM skills.

      It's always amusing to me when someone says something like this. It's so obvious, yet for some reason guys like you and I didn't manage to create a company worth hundreds of billions of dollars, with hundreds of millions of daily interactions, that collects data on people from countless other sites that embed share icons and use Facebook for logon authentication, and that made its founder a multi-billionaire.

      I don't know why you didn't do this yourself. What Nvidia does is amazing. What Facebook does is, in its own way, also amazing.

  • As opposed to just being a website with an overblown sense of purpose. Nvidia should be worth more, and always was in our non-damaged, parallel world.

    • Wrong. Intel and Apple were asleep at the wheel and allowed this to happen. Most of the pipelines are duplicated, some RISC-V like, pushing memory speeds and interconnects for all it's worth. Apple, with money to burn, IF they can book TSMC space, is in a good position to have something that is energy efficient and good. They were late to the CPU stage, and can start clean on 4nm, with no worry about legacy drivers. The math for processing is well known. Intel are now probably thinking: hey, that was not such a good idea.
    • It's not a matter of its value to society or even to the individual. It's a matter of how much revenue could potentially be generated. How many GPUs does each person in the first world need? One or two, maybe?

      Now, how many products can Facebook hawk to users? Thousands a month, to each user. How much personal data can they collect from users to feed premium revenue for targeted ads? Ultimately the scale of the business for Facebook is just bigger than what a graphics card company will see.

  • So they just halved the company value by rebranding it...
    Sit nomen omen ("may the name be an omen").
  • I'm waiting for the company formerly known as Facebook to reach parity with MySpace before I buy.
  • My old left dog-chewed slipper is worth more than Meta.
  • The main reason the Meta stock price tanked seems to be that Peter Thiel left and investors think Marc Zucks.
  • One company makes stuff and one company sells ads that no one sees for a dead platform.
  • At least NVidia produces something useful.
  • To their marketing dept.

    They have created an immense group of drones that will blindly worship Nvidia no matter what crap they do to the industry or their own customers.

    They are also in charge of sending the lovely loyalty checks to all the websites like TechSpot and YouTubers like LTT, so they continue posting crap about Nvidia non-stop, which ends up as... food for the nvdrones!

    I mean, watch any tech video on YouTube and even if the video itself is not about Nvidia, they will make sure to mention them.
