
  • WaltC - Monday, August 9, 2021 - link

    Hmmmm...what is "open" about it, exactly?...;) Always so informative to see buzzwords like "synergistic" every now and then [not]...;) OCP reminds me of the ancient "ARB Committee" that shepherded OpenGL right into becoming an historical footnote (Vulkan replaced it as a 3D API.) ARB had a lot of big names associated with it, but they seldom agreed, and the progress of the API was glacial, accordingly. But like "synergistic", the "ARB Committee" certainly sounded important, even though it was fairly ineffectual.
  • Wrs - Monday, August 9, 2021 - link

    Data centers aren’t going away. This is a committee that helps them become more efficient and more compatible. I’m all for that. They’re going through tech that may or may not trickle down to consumer products, but rest assured the consumer ultimately foots the bill for inefficiencies and troubleshooting up in the cloud.

    OpenGL had strong competition from Direct3D, but it was still important for whatever wasn't Microsoft. Vulkan and OpenGL should not be compared, though, as they're from different eras.
  • whatthe123 - Monday, August 9, 2021 - link

    ??? A lot of the same companies on the ARB committee were founders of the Khronos Group, which maintained OpenGL and created Vulkan. So you're arguing these committees are ineffective by giving an example where one was highly effective and adopted industry-wide?
  • mode_13h - Tuesday, August 10, 2021 - link

    Hmm... the second paragraph seems to sum it up:

    > standardization. Without a consistent size, depth, or definition to the size of a server,
    > a deployment can easily end up as a hodge-podge of malformed hardware with no
    > discernable high-level design methodology. While the silicon vendors or the OEM
    > partners building the systems could have their own way of doing things,

    It's not just unifying vendors, either. Google had/has their own way of building systems, too. If every manufacturer has their own preferred approaches and every big customer has their own preferences, the result is less efficiency for both customers and manufacturers!

    > OCP reminds me of the ancient "ARB Committee" that shepherded OpenGL
    > right into becoming an historical footnote

    Like most technology standards, OpenGL was relevant until it wasn't (although it's not dead yet!). As @whatthe123 pointed out, OpenGL helped establish Khronos Group and got software & hardware vendors collaborating in an open way that gave rise to Vulkan!

    > the progress of the API was glacial, accordingly.

    It kept pace pretty well with Direct3D, sometimes even getting ahead of it. I don't know if you have any examples you'd like to cite, but it seems to me that OpenGL evolved about as rapidly as the hardware. And *more* rapidly than most software! Plus, it had an extension framework that let GPU vendors trial new features in a way that didn't require consensus but still helped pave the road ahead.
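
    To make that mechanism concrete, here's a minimal sketch (not from the article; it assumes a current GL 3.0+ context and that a loader like GLEW or GLAD has already resolved the entry points) of how an app probes for a vendor extension at runtime:

    ```cpp
    // Hedged sketch: probe the running driver for a vendor extension.
    // Assumes a current GL 3.0+ context and an initialized loader (GLEW here).
    #include <cstring>
    #include <GL/glew.h>

    bool has_extension(const char* name) {
        GLint count = 0;
        glGetIntegerv(GL_NUM_EXTENSIONS, &count);
        for (GLint i = 0; i < count; ++i) {
            const char* ext = reinterpret_cast<const char*>(
                glGetStringi(GL_EXTENSIONS, static_cast<GLuint>(i)));
            if (ext && std::strcmp(ext, name) == 0)
                return true;
        }
        return false;
    }
    ```

    A vendor could ship a feature like GL_NV_mesh_shader this way, let developers exercise it in the real world, and only later push it toward EXT/ARB status or core, with no up-front committee consensus required.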
  • willis936 - Friday, August 13, 2021 - link

    >OpenGL was relevant until it wasn't

    Vulkan regularly outclasses less general graphics APIs. Vulkan is a combination of OpenGL and OpenGL ES. Is Quake's game engine not relevant even though nearly every first-person shooter has some code from it?
  • mode_13h - Saturday, August 14, 2021 - link

    > Vulkan is a combination of OpenGL and OpenGL ES.

    No, it's not. It's a fundamental reformulation of a graphics API. AMD's Mantle was probably the biggest contributor, but I think even that served more as inspiration and a proof of concept.

    It took quite a lot of work to write Zink, which is an OpenGL implementation built atop Vulkan.

    https://www.collabora.com/news-and-blog/blog/2018/...

    If you're not writing something like a game engine, OpenGL is a lot easier to work with than Vulkan. And that's saying something, as OpenGL isn't exactly a simple API.
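
    To give a hedged sense of that gap (a sketch, with context creation via GLFW/SDL/etc. omitted): the body of a classic OpenGL render loop can be this small because the driver owns memory, synchronization, and submission, whereas the equivalent first frame in Vulkan needs an instance, device selection, a swapchain, command buffers, and explicit sync before you can clear anything.

    ```cpp
    // Sketch only: assumes a current OpenGL context created elsewhere.
    #include <GL/gl.h>

    void draw_frame() {
        glClearColor(0.1f, 0.1f, 0.1f, 1.0f);                // choose a clear color
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);  // clear the frame
        // ...issue draw calls; the driver tracks all state behind the scenes...
    }
    ```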

    > Is Quake's game engine not relevant

    Huh? Who said anything about Quake?
  • wrkingclass_hero - Monday, August 9, 2021 - link

    "An Interview with Intel's Rebecca Weekly"
    Once is enough
  • mode_13h - Tuesday, August 10, 2021 - link

    :D
  • Paradroid888 - Tuesday, August 10, 2021 - link

    Is this going to be a weekly interview?
  • wrkingclass_hero - Wednesday, August 11, 2021 - link

    I think it already is.
  • JayNor - Monday, August 9, 2021 - link

    "the new CXL standard is about having a fundamental understanding of the TLB on a device that is not my own!"

    I thought CXL was more about having a Home Agent for the accelerator located on the host, along with the biased cache coherency. Wouldn't the TLB cache still be on the accelerator?
  • Ryan Smith - Monday, August 9, 2021 - link

    The next sexist post I have to delete will come with a ban. Be better than that.
  • Ryan Smith - Monday, August 9, 2021 - link

    Ban count (thus far): 1
  • CallumS - Monday, August 9, 2021 - link

    Kudos, Ryan. Please keep it up!

    It was/is a great interview!
  • shabby - Monday, August 9, 2021 - link

    Please add an edit button!
  • Ryan Smith - Tuesday, August 10, 2021 - link

    While I get why people want this, I've held off on adding an edit button because of all the additional moderation work it adds on our end. Not only would we have to review new comments, but edited comments as well. Especially with spammers (who have been working overtime the past month), it really adds up. Right now the trade-off is likely not worth it.
  • TristanSDX - Tuesday, August 10, 2021 - link

    The 'Edit' button could be active for only 10 minutes after posting; many forums have such a solution.
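
    The check itself is trivial; something like this C++ sketch (the function and field names are invented for illustration, not AnandTech's actual backend):

    ```cpp
    // Hedged sketch of a 10-minute edit window; names are hypothetical.
    #include <chrono>

    using Clock = std::chrono::system_clock;

    bool edit_allowed(Clock::time_point posted_at) {
        return Clock::now() - posted_at <= std::chrono::minutes(10);
    }
    ```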
  • shabby - Tuesday, August 10, 2021 - link

    Bingo!
  • shabby - Tuesday, August 10, 2021 - link

    Isn't that what the report button is for? If had one...
  • shabby - Tuesday, August 10, 2021 - link

    If we had one 🙄
  • mode_13h - Tuesday, August 10, 2021 - link

    Thanks for taking out the trash!
    : )
  • Samus - Tuesday, August 10, 2021 - link

    Frankly, I'm proud of how the female AT readers are holding back the comments about Ian. I mean, look at that smile; it melts me!
  • Spunjji - Friday, August 13, 2021 - link

    Much appreciated.
  • Threska - Monday, August 9, 2021 - link

    "Cascade Lake OCP Server"

    Going to be interesting seeing the rack that one goes into.
  • Ian Cutress - Monday, August 9, 2021 - link

    https://www.anandtech.com/show/14755/hot-chips-31-...
  • RSAUser - Monday, August 9, 2021 - link

    This article reads like a fluff piece with no content.
  • DigitalFreak - Monday, August 9, 2021 - link

    Did you expect something else? This site went downhill when Anand left and they sold out to Purch 7 years ago. It's gotten worse under Future. It's truly sad to see what was once one of the top tech sites reduced to publishing PR pieces and regurgitating press releases, along with the occasional late review. RIP AnandTech 1997-2014
  • Amandtec - Tuesday, August 10, 2021 - link

    Bah. The site is fine. I have been here since 1997. Press releases are part of tech news. Working with big tech is quid pro quo.
  • mode_13h - Tuesday, August 10, 2021 - link

    There's plenty of worthwhile content and articles or details you won't find elsewhere. If you think it's really so bad, why are you even here?

    While their coverage is much too sparse to be my only tech news source, I often find more depth than at most other sites. That's why I keep coming back.
  • Oxford Guy - Wednesday, August 11, 2021 - link

    ‘If you think it's really so bad, why are you even here?’

    Logical fallacy. Do you know its name?
  • mode_13h - Thursday, August 12, 2021 - link

    Quit trolling. Either reply to the substance of the issue or move on.

    And I do think it's a legitimate question I put to @DigitalFreak, who apparently considers the site dead for the past 7 years.
  • Oxford Guy - Thursday, August 12, 2021 - link

    ‘Quit trolling. Either reply to the substance of the issue or move on.’

    That’s not a satisfactory rebuttal for being caught posting yet another fallacy.
  • mode_13h - Friday, August 13, 2021 - link

    Quit wasting my time and yours. If it's not relevant and of intrinsic value, don't bother posting it.
  • Oxford Guy - Sunday, August 15, 2021 - link

    ‘Quit wasting my time and yours’

    Evidently, trying to get you to fathom why your post contained a fallacy and why your frequent use of fallacies here is detrimental to the quality of discourse constitutes time wasting in your estimation.

    Your refusal to admit fault, state the name of the fallacy you employed, and pledge to do better in the future is noted.
  • mode_13h - Monday, August 16, 2021 - link

    This is a game you're playing by yourself. Antagonizing me is not contributing to the discourse.
  • Spunjji - Friday, August 13, 2021 - link

    @Oxford Guy - If you want to talk fallacies, you just put the fallacy fallacy on display here. Whether or not a logical fallacy was technically committed, it's fair to ask someone who has declared a site to have *died* in 2014 why they're still here in 2021 coming back for content they say they don't like. The implication is that their statement is hyperbole, not that they should go away.
  • mode_13h - Saturday, August 14, 2021 - link

    Thanks. Yes, exactly.
  • Oxford Guy - Sunday, August 15, 2021 - link

    Your attempt at a rebuttal does not justify Mode’s use of the fallacy.

    Your rebuttal attempt is also the tu quoque fallacy.
  • martinpw - Sunday, August 15, 2021 - link

    @oxford guy Do you have any idea how ridiculous you sound?
  • mode_13h - Monday, August 16, 2021 - link

    Your use of Latin does not justify worthless posts.
  • damianrobertjones - Tuesday, August 10, 2021 - link

    Although I do agree, you still have to make the cash and, if I'm honest, Anand was far, FAR too interested in Apple. It got annoying fast.
  • Oxford Guy - Wednesday, August 11, 2021 - link

    A 2 trillion market valuation in tech is definitely unworthy of significant coverage.
  • mode_13h - Thursday, August 12, 2021 - link

    > A 2 trillion market valuation in tech

    It wasn't, back then. And their recent success has a lot less to do with their tech and a lot more to do with marketing and business strategy.

    Not only that, but publishing is a lot about knowing your audience. If most readers are PC users, then annoyance with an Apple preoccupation is understandable, especially if there aren't sufficiently compelling technical achievements to justify it.
  • Oxford Guy - Thursday, August 12, 2021 - link

    ‘It wasn't, back then.’

    Few people start successful tech sites at 14, too.

    Thank you for the predictably banal attempt at being argumentative, though. As I’ve said before, I generally know what you’ll post before you do, as specious arguments aren’t difficult to concoct.

    Apple was very influential long before Anandtech existed. It was the company that got consumers reliable removable storage (affordable floppy drives, via Woz’s software controller innovation), a highly-refined GUI (two, actually) — whilst Microsoft was peddling much inferior UI... et cetera.

    Moreover, the founder of a site who has a strong track record for delivering high-quality tech reporting is entitled to enough latitude to cover a major tech firm.

    All of this is patently obvious and therefore should not require comment.
  • mode_13h - Friday, August 13, 2021 - link

    > I generally know what you’ll post before you do

    I don't consider that a bad thing. Unpredictability isn't one of my priorities. Indeed, it's hard to see how one could take a fair and logical view of matters, while also being completely unpredictable. Logic has an inherent strain of predictability.

    I also don't apologize for banal posts. Entertaining you is not among my goals.

    > specious arguments aren’t difficult to concoct.

    If they're specious, they should be easy to deconstruct.

    > Apple was very influential long before Anandtech existed.

    That's not the relevant standard. The standard is whether there's anything to report about them that's of interest to the readership. For that standard to be met, there must be some combination of technical depth and relevance to one's own priorities. To do that on a PC-oriented site, you either need an extraordinary degree of depth, such as the recent coverage of their CPU cores, or need to cover products that readers would seriously consider using.

    > It was the company that got consumers reliable removable storage

    But this isn't a tech history site. It's news, reviews, and in-depth coverage.

    > the founder of a site who has a strong track record for delivering high-quality tech
    > reporting is entitled to enough latitude

    Sure. He can write whatever he wants. So can the current contributors. Readers are also entitled to be annoyed by a preoccupation that seems unjustified.

    > All of this is patently obvious and therefore should not require comment.

    It's progress towards exposing the point of contention. That's at least something.
  • Oxford Guy - Sunday, August 15, 2021 - link

    ‘But this isn't a tech history site. It's news, reviews, and in-depth coverage.’

    It’s a tech site and tech history is part of that. Apple’s history is obviously relevant.
  • JoeDuarte - Monday, August 9, 2021 - link

    I usually find all the industry hype about "hyperscalers" pretty depressing. What a waste of talent, resources, energy, etc. Facebook is probably the best example. All those high priced engineers, all that money, thousands and thousands of servers, countless megawatts or gigawatts, and for what? So that idiots can post selfies? So that Facebook's Orwellian Truth Police can censor non-leftist discourse, like how they censored discussions of the novel coronavirus's plausible origins in the Wuhan lab that researched coronaviruses? That's what we need hyperscalers for? Why are people even working for such sleazy companies? It looks pretty unethical, and at best Facebook is a vehicle and amplifier for vanity, vacuousness, and narcissism.

    When I think of the cloud I think of extremely poorly optimized software thrown onto an overpriced instance, iterated millions of times across companies, teams, and time. We need a new generation of secure, sane, and innovative operating systems a lot more than we need a bunch of overpriced instances of Linux or Windows Server. The cloud and these big incumbents are stifling much needed innovation and progress in computing. All the hype and propaganda around the cloud and "open" projects run by huge soulless corporations and sleazefests like Facebook are obscuring the fact that we haven't made any progress in OSes in many decades, or programming languages.

    By the way, anyone know what she's talking about here: "do you remember walking through the data centers with all the LEDs, and they were just so perfect? At the time companies were manufacturing their own screws as if that was important!"

    Lights? What lights? And screws? What is she referring to? Someone made their own screws? Is that important?
  • martinpw - Monday, August 9, 2021 - link

    She mentioned SGI in that paragraph. Do a search for images of "SGI Origin 2000", for example. I think that is the kind of thing she is talking about. They were beautiful-looking systems, with lots of effort spent on aesthetics and almost everything custom-made (e.g., the screws), and then they just sat in a datacenter where almost nobody looked at them. I remember getting tours of a couple of VFX render farms back when they used SGI machines - they definitely looked impressive, but I guess the point is that a lot of money was wasted doing stuff like that. Then hyperscalers came along and stripped all that away, reducing servers to the absolute bare-bones minimum required and focusing purely on performance.
  • mode_13h - Tuesday, August 10, 2021 - link

    At that time, I think the effort put into aesthetics was justifiable. Those companies already made workstations that had to look like they cost as much as an engineer's salary (because they did!), so they already had the design staff needed to make good-looking machines.

    And when you consider the base price of the hardware, in which virtually everything was custom-designed & all the software was custom-developed, the extra cost of a nice faceplate was probably less than a couple %. And if that helped win accounts by impressing suits who probably still used secretaries to take dictation on word processors, then it was money well-spent.

    Is it a good thing that we're past all of that? Absolutely. But, you have to look at such things in context.
  • Wrs - Tuesday, August 10, 2021 - link

    Gotta isolate design/engineering progress from lame uses by laypeople. Data centers used to be mainframes, and bandwidth limits forced many workloads onto exotic local workstations. When computing costs so much you're going to make sure it is productive and high priority. These days bandwidth is dirt cheap and racks are made of commodity hardware. For every Dropbox there's going to be an Instagram. It's still a much better place to be.
  • mode_13h - Tuesday, August 10, 2021 - link

    > and for what? So that idiots can post selfies?

    In spite of the corrosive potential of social media, it's hard to argue they're not technically interesting users. That's how I see it. You don't have to like Facebook's product to appreciate the technical challenges and solutions they devise, and those solutions have many other applications, too.

    > When I think of the cloud I think of extremely poorly optimized software
    > thrown onto an overpriced instance, iterated millions of times

    If you believe in capitalism, then why conclude that cloud-based software is any more poorly-optimized than in conventional deployments? If anything, the cloud billing model should be *more* incentive for developers to optimize highly-replicated instances than if they simply owned the hardware outright and had to pay only for power & cooling.

    > We need a new generation of secure, sane, and innovative operating systems

    If you had that, then you could still deploy it on the cloud... or not. It's orthogonal to the discussion, since there's no amount of optimization you could do in operating systems that would eliminate the need for cloud.

    > The cloud and these big incumbents are stifling much needed innovation
    > and progress in computing.

    How so? Evidence? It seems to me they're pushing hardest on power efficiency, which is one of the main factors working in favor of ARM & other post-x86 ISAs.

    > huge soulless corporations

    Uh, that's redundant. Like it or not, only huge corporations are big enough to drive real advances in computing hardware. Or maybe governments, in the case of China.

    > we haven't made any progress in OSes in many decades, or programming languages.

    How do you figure? Because Linux and Windows are still the main players? A ton of stuff has changed under the hood of those operating systems, though.

    I can't even guess why you think programming languages haven't progressed. I'd point to Go, Swift, Julia, and Rust as examples of languages that didn't exist a decade ago. And older languages, like Python and C++, have continued to evolve new features and performance improvements.
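
    To pick one concrete, hedged illustration of that evolution: everything in this snippet is C++17, and none of it existed in the C++ of a decade earlier.

    ```cpp
    // Three C++17 features older C++ simply didn't have: std::optional,
    // if-with-initializer, and structured bindings.
    #include <iostream>
    #include <map>
    #include <optional>
    #include <string>

    std::optional<std::string> lookup(const std::map<int, std::string>& m, int key) {
        if (auto it = m.find(key); it != m.end())  // if-with-initializer
            return it->second;
        return std::nullopt;                       // explicit "no value"
    }

    int main() {
        std::map<int, std::string> isas{{1, "x86"}, {2, "ARM"}};
        for (const auto& [id, name] : isas)        // structured bindings
            std::cout << id << " -> " << name << '\n';
        if (auto hit = lookup(isas, 2))
            std::cout << "found: " << *hit << '\n';
    }
    ```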

    Web Assembly and WebGPU are some new, enabling technologies to watch. Also, LLVM has really been a big story of the past decade, powering a lot of the advances in languages, tools, and programmable hardware.
  • FunBunny2 - Tuesday, August 10, 2021 - link

    " I'd point to Go, Swift, Julia, and Rust, as examples of languages that didn't exist a decade ago. "

    considering all of those are just 'translators' to C(++), not much 'progress'. Algol, COBOL, Fortran, Forth, Smalltalk, Prolog, and C The Universal Assembler of course, just about cover the possible approaches to running a Von Neumann machine. a few other hardware architectures have been experimented with, but none got any traction. there's only so much fiddling you can do at the language level with (nearly) monolithic hardware. and it all goes through C.
  • mode_13h - Wednesday, August 11, 2021 - link

    That's a lot of words to say essentially nothing. If your standard for a novel language is one that can't be transformed into multi-threaded C code, then you need to build some hardware not programmable in C. And that has nothing to do with what @JoeDuarte was saying.

    I dare you to try to convince any group of experts in the languages I cited that their tool of choice is no better than C. Just make sure to wear some padded knickers.
  • GeoffreyA - Tuesday, August 10, 2021 - link

    Got to admit, the world is a more lamentable place with Facebook.
  • FunBunny2 - Tuesday, August 10, 2021 - link

    "Facebook's Orwellian Truth Police"

    yeah, right. Facebook has been the $$$ mover for the Right Wingnuts from the beginning. FB only began to rein in the propaganda in the last few months.
  • Oxford Guy - Wednesday, August 11, 2021 - link

    Left and right is a football game put on to distract the plebs whilst the money flows out the back door.
  • mode_13h - Thursday, August 12, 2021 - link

    That's just what the politically disengaged tell themselves to justify their apathy and ignorance. It's also the sort of thing parroted by government trolls hostile to democracies, in order to foster that exact sort of apathy.

    Politics matters. Be wary of anyone who says otherwise.
  • Oxford Guy - Thursday, August 12, 2021 - link

    The ancient Athenians stuffed the ballot box. Rich folk hired people to create votes (carved shells).

    Democracy continues to be the same today. Magic ad hom buzzwords like ‘troll’ won’t change that.
  • mode_13h - Friday, August 13, 2021 - link

    > The ancient Athenians stuffed the ballot box.
    > Rich folk hired people to create votes (carved shells).

    Contrary to what some people claim, voting security has come a long way. Paper ballots can be stamped with unique identifiers and a cryptographic hash, so that they're impossible to forge. And we have databases to track voters, so we can check that no one votes more than once.
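
    As a hedged sketch of the stamping idea (using OpenSSL's HMAC; the serial format and key handling are invented for illustration): the authority keys a MAC over each ballot's serial, so producing a valid stamp without the secret key is computationally infeasible.

    ```cpp
    // Sketch only: stamp a ballot serial with HMAC-SHA256. The key handling
    // and serial format are hypothetical; a real system needs far more
    // (key ceremony, audit trails, revocation, etc.).
    #include <cstdio>
    #include <cstring>
    #include <openssl/evp.h>
    #include <openssl/hmac.h>

    int main() {
        const unsigned char key[] = "authority-secret-key";   // hypothetical
        const char* serial = "PRECINCT-042/BALLOT-0001937";   // hypothetical
        unsigned char mac[EVP_MAX_MD_SIZE];
        unsigned int mac_len = 0;

        HMAC(EVP_sha256(), key, static_cast<int>(sizeof key - 1),
             reinterpret_cast<const unsigned char*>(serial),
             std::strlen(serial), mac, &mac_len);

        std::printf("%s : ", serial);
        for (unsigned int i = 0; i < mac_len; ++i)
            std::printf("%02x", mac[i]);                      // the printed stamp
        std::printf("\n");
    }
    ```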

    The biggest issues we face now are around campaign funding & political ads, particularly in the age of targeted digital advertising. Also, in the realm of creating apathy that suppresses turnout in key demographics.

    > Magic ad hom buzzwords like ‘troll’ won’t change that.

    Troll farms are a fact. They're operated by government and political actors. And why do they go to the trouble & expense? Because they correctly recognize that politics matters.
  • GeoffreyA - Saturday, August 14, 2021 - link

    "Democracy continues to be the same today."

    Sure, people don't have much choice and are misled by propaganda during election time, but with all its flaws, democracy is still better than its alternative, I'd say. I'm no expert on these things, but it's of interest to me, and I'd like to ask you, Oxford Guy, how can society be improved? Because solutions are what counts. Is there a better form of government than democracy? Or is some principle missing from the democracies of today, which, when discovered, will put the system right?
  • Oxford Guy - Sunday, August 15, 2021 - link

    ‘Oxford Guy, how can society be improved? Because solutions are what counts’

    The first step is being willing to accept reality, rather than comfortable/convenient Pollyanna beliefs that are spoon-fed to the masses to ensure compliance with the plutocratic economic engine.

    Ask Michael Connell.
  • GeoffreyA - Sunday, August 15, 2021 - link

    So that means accepting that society is split into three groups: the high (who are ruling), the middle (who, paying lip-service to equality, are trying to overthrow the high by using the proletariat), and the proletariat (whose lot never really changes much). Is there some way to improve this? Using an AI perhaps?
  • mode_13h - Monday, August 16, 2021 - link

    Conspiracy theory nuts don't have solutions. They're invested in making problems seem so insurmountable that only a populist demagogue could possibly turn things around. That's why his cynicism is so pernicious.

    > The first step is being willing to accept reality

    You don't have any unique claim to reality. And what I've seen from you is a pattern of working an agenda, rather than being invested in actually finding good information, solid facts, and a firm grip on reality. That should cast a dark cloud of suspicion over your worldview.

    If you believe in reality, then you should see that it's not all bad. I can't speak to your specific circumstances, but the annoying thing about capitalism and the current economic order is that it's not as broken as you'd have us believe. And your failure to accept that shows your agenda is not actually rooted in accepting reality.
  • Oxford Guy - Tuesday, August 17, 2021 - link

    'Conspiracy theory nuts don't have solutions.'

    Should we institute a system by which a quarter is deposited for each fallacy (flamboyant like that one, or otherwise) you post, in lieu of a good-faith argument and relevant information?
  • mode_13h - Friday, August 20, 2021 - link

    This self-appointed refereeing doesn't seem to be going so well for you, does it?

    A referee needs to be established as competent and disinterested. The latter is clearly absent and you've done nothing to show that your allegations are consequential and in good faith.

    I've already told you I'm not playing your game, but the reason is that it seems more of a distraction than truly contributing to the discourse. The only remedy for bad internet arguments is better ones. If you have better arguments, let's hear them.

    To the extent that internet arguments are winnable, victory never hinges on a technicality. Your attempts to lawyer me into submission reveal a failure to appreciate this.
  • mode_13h - Monday, August 16, 2021 - link

    > So that means accepting that society is split into three groups

    That's an unfortunate oversimplification. The main problem with it is that it's disempowering. No one is completely blameless, but that also means they're not completely powerless. We've seen a lot of political mobilization, in recent years, and with real consequences!

    Democracy is a messy affair. Nobody gets everything they want and certain groups indeed have outsized influence, but real change does happen.

    > Is there some way to improve this? Using an AI perhaps?

    Deus ex machina.

    I won't say it'll never happen, but I think sitting around and waiting for it is a luxury we don't have. And there's no reason to expect that it won't be used in an unfair way.

    All I can say is to engage and get involved. People should invest in good journalism. That the US Constitution lists freedom of the press in the 1st Amendment recognizes just how crucial it is for good governance.
  • GeoffreyA - Monday, August 16, 2021 - link

    Maybe we can borrow the MAGI from Ritsuko Akagi and get a head start! ;)
  • mode_13h - Tuesday, August 17, 2021 - link

    Also, I don't see why people would even trust that AI is unbiased, that it's not somehow being manipulated, that it hasn't been hacked, or that it hasn't gone psycho.

    People have enough trouble trusting other people. But, at least we feel like we understand people. A machine is just such a huge unknown and so unknowable. As long as the people of tomorrow are anything like the people of today, I don't think they'll ever willingly be ruled by an AI. If you think about all the anti-government sentiment in the zeitgeist, just imagine how people would react to the prospect of being "enslaved by machines".
  • Oxford Guy - Tuesday, August 17, 2021 - link

    'But, at least we feel like we understand people.'

    The royal we again, is it?
  • mode_13h - Friday, August 20, 2021 - link

    No, it was an obvious speculation about humanity. Don't be so obtuse.
  • GeoffreyA - Tuesday, August 17, 2021 - link

    The cynic in me feels that only a machine could run society well, but I know that'll never come to pass and, I hope, don't fancy it myself. Nobody's going to accept an AI openly ruling, even if it were proven to be moral, effective, etc., etc. Then, the machine-going-bonkers sentiment is strong in us. But I've got a feeling they'll be more like TARS than HAL!

    Well, an AI could end up ruling by operating in an invisible fashion, manipulating man without his knowing it. Perhaps today's Facebook and Google are rudimentary versions of that. Or there's a possibility that if the AI were to put on human personality, particularly a humorous or inspiring one, and demonstrated goodwill over time, it could gain many a follower under a democratic setup. AI for president, anyone?

    In short, humanity will not abide open oppression, but can be successfully manipulated by using a more sophisticated system that grasps and plays into our weaknesses, desires, and vanities.
  • mode_13h - Friday, August 20, 2021 - link

    > an AI could end up ruling by operating in an invisible fashion,
    > manipulating man without his knowing it.

    Right. Humans are already pretty good at manipulating each other. A machine super-intelligence could study us and our manipulation techniques, then work behind the scenes to shape events in its favor.

    There's a recent TV series which explores exactly that, called Next. The drama seemed a bit forced, at times, but I thought it was worth watching.

    https://www.imdb.com/title/tt9315054/

    > if the AI were to put on human personality, particularly a humorous or
    > inspiring one, and demonstrated goodwill over time, it could gain many a
    > follower under a democratic setup. AI for president, anyone?

    You could certainly have a charismatic AI that garnered a following, but the problem is the unknowability of what truly lies beneath it. Even in the case of human politicians, voters like to know their backstory and try to gain a sense of their motives, in order to start trusting them. With an AI, there's not only the risk that its charisma is a facade, but also that it's being operated or manipulated by other humans.

    > humanity will not abide open oppression,

    Something about the recent rise of authoritarianism seems to argue against this.

    > but can be successfully manipulated by using a more sophisticated system
    > that grasps and plays into our weaknesses, desires, and vanities.

    The ancient Greeks knew this. It's the reason transparency and accountability are so important. The only way to combat the more base elements of mass appeal is to focus on actions and results. This brings us back to the importance of good journalism.

    Also, people need to actually care. Good governance can't happen without some effort on the part of the governed.
  • GeoffreyA - Sunday, August 22, 2021 - link

    Thanks for that recommendation. The premise of a constantly improving rogue AI reminds me of Ghost in the Shell, but it's been a while since I've seen that, so I forget the details.
  • GeoffreyA - Sunday, August 22, 2021 - link

    And actually, concerning an AI politician, I realise there's a slim chance it'll work. I can picture the fear mongering and headlines already: "Matrix AI runs for President" or "Will Siri now sit in White House?" And I can picture those agitators working overtime, when for all we know, the AI might be honest. And it goes back to the question of what differentiates human from AI?

    And concerning good journalism, yes, very important. Without accurate information, we're wandering in the dark. Truth, it mends much, both minds and societies (along with actions). "When the press is free, all is safe."
  • mode_13h - Monday, August 23, 2021 - link

    > And it goes back to the question of what differentiates human from AI?

    Well, what is an AI? I can tell you a lot about a human. Fundamentally, a human has one wet brain between their ears.

    But, an AI? How can we truly know it? Do we get to inspect its source code and its deep learning model? Even if we do, how do we know that's truly what's running when it's supposedly issuing statements and orders? And how can we know it hasn't been hacked?

    Also, if it can be inspected, does that mean it can be cloned? And if adversaries can clone it, can they then conduct experiments to see how it can be fooled or misled?

    All of this is incredibly problematic. I think humans will only accept being ruled by humans. Although, that doesn't eliminate the possibility of an AI working behind the scenes (which is not entirely different from the way human and corporate entities try to influence politicians).
  • GeoffreyA - Monday, August 23, 2021 - link

    In full agreement that humans won't accept being ruled by anything non-human. I suppose what I'm trying to get at is not so much ruling and presidents but rather what we are. Is the mind really computable (I believe it is) or is there something else? Perhaps a soul? The evidence suggests that consciousness is being implemented in the anterior cingulate cortex and there's some sort of recursive circuitry at work. Which makes sense. (Even with a Creator.)

    Then, if the brain is computable, the same criticisms of AI can possibly be turned round on us. I'm not disputing that a human is a human, and I myself wouldn't trust an AI. I'm struggling to put this notion into words. Let's say the same abstraction hiding our inner workings from others is also hiding them from ourselves. That is, we ourselves might be some sort of Chinese room/Turing test setup and not know it. And even if that were the case, it wouldn't make our pains and joys any less real. Toothache, we know, is very real.

    So my question should rather be: when human and AI can't be told apart from external actions, and both report consciousness, is there something that differentiates them in some deeper way? And not the fact that one is brain and the other, mind running on silicon.
  • mode_13h - Tuesday, August 24, 2021 - link

    > is there something that differentiates them in some deeper way?

    Yes. For better and for worse, humans are inextricably tied to our biology and our environment. And I don't just mean "tied" in the sense that they're physically bound, but actually that our body & environment affect our mood and cognition.

    Another interesting point of neurochemical consideration is that of mood. Certainly, the concept of moods can be introduced to an AI, but getting it tuned just right, so that it affects cognitive processes in a moderate and sensible way, would be tricky. We don't know what it's like to function in the world without moods. I'm not sure an AI would seem "normal" to us without moods; however, you certainly don't want it to be *too* moody!
  • mode_13h - Tuesday, August 24, 2021 - link

    I guess we could also get into the concept of mortality that an AI wouldn't have. The sense of our own mortality can inspire some of the best (and worst) behaviors in humans.
  • GeoffreyA - Wednesday, August 25, 2021 - link

    The physical manifestation of a thing can make all the difference, a bit like Rocket Lake. The body is tied to its varying biology, agreed. But I'd venture to say there's a schematic of the Human Machine behind it, which, at least in pieces, could be done in a different system. Crude example: iron instead of bones. If so, if we could reverse engineer the secret of the mind (say, find Skylake's design by analysing the silicon), it's conceivable we could implement it again; and so, in principle at least, human and AI are on the same footing. In practice, there'll likely always be a discrepancy between the two. (A further point for thought: I've always felt we are "non-portable." We're tied to the physics of our world in every way possible, even the four-dimensional spacetime.)

    As for the mortality question, if mind uploading were impossible (perhaps quantum constraints), they'd also be mortal; even computers break. Even better, if created on biology, they'd age as we do, with all its pains and joys. And truly, everything is subject to entropy this side of Existence.
  • GeoffreyA - Wednesday, August 25, 2021 - link

    And good point about mood. We wouldn't want them too moody!
  • mode_13h - Thursday, August 26, 2021 - link

    > And good point about mood. We wouldn't want them too moody!

    Well, mood serves a functional purpose. It's clearly a powerful tool. It also plays a role in personality. And I wonder if an AI without moods would seem less human or relatable, making it ultimately seem more alien and less trustworthy.
  • GeoffreyA - Friday, August 27, 2021 - link

    Spot on. Even in human interaction, especially with those one loves or cares about, lack of mood or being apparently emotionless gives a really awful feeling. You feel chilled or hurt. Varying mood, within reason, is what we often grow fond of in someone, especially in matters of the heart!

    Well, an AI could simulate mood by tying personality weightings to metrics like power usage, temperature, etc.
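
    Something like this back-of-the-napkin C++ sketch, where every name, weight, and threshold is invented purely for illustration:

    ```cpp
    // Whimsical sketch: derive a 0..1 "mood" from hardware telemetry and use
    // it as a personality weighting. All names and weights are invented.
    #include <algorithm>
    #include <cstdio>

    struct Telemetry {
        double power_draw_w;  // current power usage, watts
        double core_temp_c;   // die temperature, degrees C
        double load_factor;   // 0.0 idle .. 1.0 saturated
    };

    // Hot, loaded, power-hungry -> grumpy (0.0); cool and idle -> cheerful (1.0).
    double mood(const Telemetry& t) {
        double stress = 0.4 * (t.core_temp_c / 100.0)
                      + 0.3 * (t.power_draw_w / 300.0)
                      + 0.3 * t.load_factor;
        return 1.0 - std::clamp(stress, 0.0, 1.0);
    }

    int main() {
        Telemetry now{180.0, 72.0, 0.65};
        std::printf("mood weighting: %.2f\n", mood(now));
    }
    ```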
  • mode_13h - Saturday, August 28, 2021 - link

    Here's where I'm completely out of my depth, but I think there are some neurochemical factors involved in mood. Simulating the effects of these mechanisms on cognition, as well as working out a reliable way to regulate mood, just adds another layer of complexity on top of the already daunting challenges of developing human-like AI.
  • GeoffreyA - Saturday, August 28, 2021 - link

    Just a layman here myself but it's fun trying to imagine how the world works. I'd guess that if we followed the plan of the brain, once deciphered, it would have mood, etc., as an elegant side effect of how it operates, based on neurochemical levels, as you pointed out.

    Though it may seem I'm all about AI, my chief concern is man, woman, and human nature. So translating it into how an AI could work sheds light, for me, on ourselves (who we are and where we came from). Of concern, though, is the morality of this whole thing. If/when consciousness is brought about, we would be bringing life and mind into a world of joy and sadness. Have we got the right to do that? As that line goes, "Did I solicit thee from darkness to promote me?" The AI might say the same to us.
  • GeoffreyA - Saturday, August 28, 2021 - link

    By the way, here's an interesting article I found yesterday.

    https://www.quantamagazine.org/mental-phenomena-do...
  • mode_13h - Monday, August 30, 2021 - link

    You say "plan of the brain", but I'm sure that it's a lot more involved than simply mapping the schema of neuron connectivity and dendrite thickness.

    > If/when consciousness is brought about, we would be bringing life
    > and mind into a world of joy and sadness.

    Well, that's not obvious to me. How do we know that an AI is experiencing things like joy and sadness in the same way as we do, if at all?

    > Have we got the right to do that?

    People have babies in all sorts of dire and precarious situations, so I guess the answer to that question is pretty much whatever we decide it is. And a baby can certainly know pain and hopelessness in the same ways as we understand them, whereas we don't know if an AI will.
  • GeoffreyA - Tuesday, August 31, 2021 - link

    "How do we know that an AI is experiencing things like joy and sadness in the same way as we do, if at all?"

    Good point. Perhaps I was anthropomorphising the AI too much. Still, crafting emotion and sentience does open a possible door to sadness and happiness. And regarding babies, good point as well. It did occur to me. But I'd argue that bringing about a new life form carries even bigger responsibility.

    Touching on the brain, my guess is that there's some trick of multiplexing the senses, thoughts, and feelings into an overlapping, carefully-synchronised window, tied to short-term memory, and that's how consciousness comes together. Perhaps illusion, but it works. (Illusion? Well, think toothache.) I don't deny that quantum phenomenona or even time could have a hand in it, but prefer the simpler explanation.
  • Supercell99 - Monday, August 9, 2021 - link

    Any wonder Intel is falling a decade behind?
  • Wrs - Tuesday, August 10, 2021 - link

    How's that? Hoping to stay relevant, both AMD and Intel are OCP members. Intel fell behind when it made a choice 14 or 15 years ago that funneled tons of money toward the ARM ecosystem, eroding Intel's capital moat and thus its process advantage. It's only fortunate that ARM hasn't overrun the data center.
  • mode_13h - Tuesday, August 10, 2021 - link

    > Intel fell behind when it made a choice 14 or 15 years ago that funneled tons of money
    > toward the ARM ecosystem

    Huh? You mean when they sold off StrongARM? Aside from that, I can't imagine what you'd mean. It's not like Intel didn't *try* to push x86 into the IoT and mobile space, but it was a square peg in a round hole.
  • Smartcom5 - Tuesday, August 10, 2021 - link

    Ever heard about Atom?

    Read:
    ExtremeTech.com • How Intel lost $10 billion — and the mobile market
    https://www.extremetech.com/extreme/227720-how-int...

    ExtremeTech.com • How Intel lost the mobile market, part 2: the rise and neglect of Atom
    https://www.extremetech.com/computing/227816-how-i...
  • mode_13h - Wednesday, August 11, 2021 - link

    Atom didn't have the perf/W, compared to ARM alternatives. ARM was already the dominant player and x86 couldn't even hold its own, much less actually beat ARM on its own turf. I know that article argues a different point, but that still doesn't explain the failure of Intel Edison nor the ascendance of ARM in the cloud.

    > Back in 2016, we didn’t know Qualcomm had been ruthlessly enforcing licensing
    > and purchasing terms that made it effectively impossible for manufacturers to
    > offer Intel-based mobile devices.

    Ah, Intel. Maybe now you understand how it feels...
  • Oxford Guy - Wednesday, August 11, 2021 - link

    Good ole Atom...

    Design a slow chip to brag about saving power and pair it with a grotesquely inefficient chipset.

    I would say ‘only Intel’ but AMD gave Intel a run for its money with the FX 9590.
  • Wrs - Tuesday, August 10, 2021 - link

    Sold off StrongARM, couldn't get x86 into the iPhone. Stuck with x86, got Atom efficiency competitive, actually (thank the process node), but the rest of the platform consumed excess space and power. ARM designers and foundries raced ahead on idle power.
  • Wrs - Tuesday, August 10, 2021 - link

    Could rewrite history if only Intel hadn't been stubborn about reusing so much of their desktop/notebook assets, and worried about cannibalizing high-margin SKUs. Had they gained a commanding share of mobile back in 07-08 there wouldn't be all that $ to buy ASML equipment for leading edge competing fabs.
  • FunBunny2 - Tuesday, August 10, 2021 - link

    "Had they gained a commanding share of mobile back in 07-08 there wouldn't be all that $ to buy ASML equipment for leading edge competing fabs"

    is that to assert that, prior to 07-08, Intel was fully vertically integrated? made their own tools? like Henry Ford did at River Rouge - iron ore, coal, sand, wood in one end and Model T out the other?
  • Wrs - Wednesday, August 11, 2021 - link

    Not at all, every leading edge fab costs $billions because ASML can charge so much for individual tools that no other company seems able to make right. The only companies that buy these tools for research and eventual production are wealthy chipmakers that see a future chasing the leading edge. Just before the mobile breakout, that was basically Intel, IBM, and AMD (piggybacking off IBM). It's because Intel stumbled badly on mobile that Samsung LSI and TSMC - formerly relegated to budget, trailing-edge fabrication - saw their futures brighten considerably.
  • thestryker - Tuesday, August 10, 2021 - link

    I've found it pretty interesting over the last decade how all of these huge companies have managed to keep working together without splintering off with their proprietary setups. The overall amount of cost savings must have been immense for installations and upgrades.

    Really great to catch an interview where the interviewee is clearly excited about the work they're doing. Looking forward to the interviews you've all got lined up.
  • mode_13h - Tuesday, August 10, 2021 - link

    > I've found it pretty interesting over the last decade how all of these huge companies
    > have managed to keep working together without splintering off with their proprietary setups.

    It pretty much comes down to the fact that hyper-scalers recognize the efficiencies & other benefits of standardization, and have enough clout to force vendors to play ball.

    I think Google was a latecomer to OCP because they'd already invested heavily in their own proprietary designs, to the point that they saw it as a competitive advantage. But I think OCP gained enough momentum that Google probably started to find vendors less interested in placing bids, or quoting higher prices than comparable OCP solutions.
  • abdullh.atas - Wednesday, August 18, 2021 - link

    Super!
