Sherdog PC Build/Buy Thread, v6: My Power Supply Burned Down My House

What does the other screen show when a game is active?
Whatever you want it to. I run 3 screens and game on the middle monitor. The one to the left has YouTube or some other sort of media going. The right one has temperature monitoring software.
 
Hey all, I am looking for some thoughts on whether my intended PC is well balanced, especially with regard to the video card.

My usage is general desktop use. I have been using Ubuntu as my desktop for the last 7 or 8 years, but I might go to Windows next time around; that's still undecided. I mostly do things like:

- general life stuff on the Internet, with uploads to YouTube and maybe Google Docs being the most heavyweight web usage
- misc utility stuff like video transcodes, ebook management etc
- occasional software development if I can't get stuff done at work
- video playback on a 1440p monitor, but I want this machine to be dual-4K ready for when that gets cheap enough

I don't really game, though if I go Windows I might play the occasional driving game. Either way, I want the desktop to feel snappy, and moving around windows with video playing in them should be smooth.

The parts I am thinking about are:

CPU - Intel i7 8700
RAM - 16GB (on the high side for general usage but I sometimes need VMs)
SSD - 256GB for OS and apps, probably M.2
HDD - 2TB 7200rpm for media and long term storage, is 2TB a reliable size?
Graphics - Nvidia GTX 1050 Ti seems to be the minimum card I can get with acceptable graphics for today? Will it be quiet in normal use? Nvidia has historically had better Linux drivers than ATI/AMD on average; is this still true?
Power - 500-ish W

I'd welcome any thoughts. Especially on the graphics card, and on an appropriate amount of RAM.
 
If you're not using it, build here:
https://pcpartpicker.com/list/

Generally speaking, for gamers, this build is slightly imbalanced: favoring the CPU. For most gamers, it would make more sense to get the i5-8400 or i5-8600K (even if you don't overclock) and put the additional $100-$150 toward the GPU. That said, it depends. A lot of online MMO games tend to be more taxing on the CPU, especially when you pull a crowd in a single area/instance (ex. massive faction battles in WoW). Most aren't all that demanding on the GPU because the game developers don't want to severely limit their market. Additionally, GPUs age much more quickly than CPUs, so some gamers invest heavily in the CPU so they can upgrade the GPU once or twice before the CPU falls below hardware minimums and must be replaced.

In your case, you only mentioned the driving game, and the most demanding task you specified is 4K video playback output (since video transcoding on weak hardware merely means more time waiting), so I like the balance. Definitely go CPU-heavy. A GTX 1050 should suffice for your needs, but the GTX 1050 Ti hits the same sweet spot and is a better value for you right now thanks to its 4GB of VRAM for a mere ~$35 upcharge. So, again, I think you've done your research, and that is a sound decision.

The GTX 1050 Ti will be quiet in normal use, especially if you downclock it (always an option). The dual- or triple-fan open-air designs tend to perform best acoustically if you roll their factory overclocks back to reference levels and thus keep the fans at minimum RPM: despite the additional fans, you are often getting the best cooling-to-noise ratio. This can get more complicated if you're working with a really small ITX case, depending on how the hot air is evacuated, but ultimately the best references are all the reviews out there of specific models for each GPU with their dB measurements.

There is a specific GTX 1050 Ti model if you really want a silent experience: the fanless Palit KalmX. Good luck trying to find one, though. You'll probably end up paying a serious premium if you even can:
https://www.techpowerup.com/reviews/Palit/GeForce_GTX_1050_Ti_KalmX/29.html


16GB is not at all RAM-heavy anymore, not even for gamers. 8GB is enough, but I'm increasingly seeing 8GB listed as just the minimum in recommended game specs. So, again, I like your choice. In fact, I'd urge you to ensure that you have two additional RAM slots available on your motherboard for later expansion.

256GB is perfect for an OS/app drive. Keep in mind that M.2 drives aren't necessarily faster than SATA III SSDs; they simply can be. The Samsung 960 EVO is the no-brainer if you're after the pinnacle of performance without it costing an arm and a leg. The advantages of the more expensive Intel superfast drives are purely theoretical for pretty much everyone, and a horrible value even for those who do benefit. Meanwhile, some of the cheapest SSDs aren't really all that much faster than the latest HDDs. The Samsung 850 EVO (available as either M.2 or SATA III) is again the clear winner in performance/cost ratio if you step below the Samsung 960 EVO above. Reference here:
http://ssd.userbenchmark.com/

3TB is the best bang-for-your-buck among HDDs. Failure rates are pretty constant up to 6TB now, I believe; I don't think it's until you hit 8TB+ that you see that classic 'bleeding edge' failure rate start to tick up.

500W is sufficient, but pay attention to cable connectors. A big potential downside to the well-made lower-wattage PSUs tends to be limited room for later expansion, but for most people this isn't an issue. Stick to Corsair or EVGA if you want to KISS it. Otherwise, reference here, because this is one of the more esoteric components, and there is a massive disparity in quality and power delivery between PSUs of the same wattage:
http://www.realhardtechx.com/index_archivos/PSUReviewDatabase.html
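If you want to sanity-check that 500W figure yourself, here's a quick back-of-the-envelope sketch. The wattage figures are ballpark spec-sheet numbers I'm assuming for this build, not measurements, and transient draw can spike higher, hence the generous headroom:

```python
# Rough PSU sizing for the proposed build. TDP/board-power figures are
# approximate vendor specs (assumptions), so treat the result as a floor.

parts_watts = {
    "i7-8700 (65W TDP)": 65,
    "GTX 1050 Ti (~75W board power)": 75,
    "motherboard + RAM": 50,
    "SSD + HDD": 15,
    "fans + peripherals": 20,
}

total = sum(parts_watts.values())   # steady-state estimate in watts
recommended = total / 0.6           # target ~60% load, the PSU efficiency sweet spot

print(f"Estimated draw: {total} W")         # Estimated draw: 225 W
print(f"PSU target: {recommended:.0f} W+")  # PSU target: 375 W+
```

So a quality 450-500W unit leaves plenty of margin, even with the GPU installed.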


On the whole, this is a godawful time to build a PC. GPU prices are ridiculous on the open component market: literally about double what they should be. RAM prices are at the worst point of their yearly cycle, and it's a bad year among years. Even the case market has seen price inflation: cases that were $40 two years ago are regularly $60-$70 today. It's all bad. So you might consider the prebuilt (AIO) market. Example:
CYBERPOWERPC Gamer Xtreme GXi10980CPG Desktop (Intel i7-8700K 3.7GHz, 8GB DDR4, NVIDIA GeForce GTX 1050 Ti 4GB, 2TB HDD, WiFi & Win10 Home)

Finally, NVIDIA is expected to start releasing their next generation of "GTX 11" GPUs at some point this year:
NVIDIA's next-gen: Turing this year, followed by Ampere
 
Oh, I probably should mention that unless the Microsoft driving game you're talking about is Forza Motorsport or Project CARS you don't actually need the GTX 1050 or GTX 1050 Ti for your workflow.
  • The onboard "Intel UHD 630" graphics built into the i7-8700 can handle 4K video playback (just not multiple 4K monitors).
  • Unless "software development" involves 3D rendering, you likely won't benefit a great deal from the discrete GPU.
  • Video transcoding also benefits mostly from a stronger CPU, more RAM, and faster drives. Same for eBook management and webware use.
  • Even photo and video editing, unless it involves 3D modeling, won't benefit much from GPU acceleration.
I mention this to offer you an alternative consideration: you could forgo the GTX 1050 Ti altogether.

Alternatively, you could invest in an even stronger Intel CPU, the i7-7820X, but that doesn't include onboard graphics, so you'll need to buy a discrete GPU anyway. This CPU is just $460 right now, so it's only $160 more than the i7-8700, but it has two additional cores, two additional RAM channels, nearly 8x the L2 cache, and can be overclocked if desired. Overall it's nearly 40% more powerful at stock:
http://cpu.userbenchmark.com/Compare/Intel-Core-i7-8700-vs-Intel-Core-i7-7820X/3940vs3928
http://www.cpu-world.com/Compare/525/Intel_Core_i7_i7-7820X_vs_Intel_Core_i7_i7-8700.html
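For what it's worth, here's the quick math on that upgrade pitch, using only the figures quoted above (street prices as of this post, not current data):

```python
# Price-vs-performance check on the i7-7820X upgrade path.
# All numbers are the ones quoted above, not live market data.
i7_8700_price, i7_7820x_price = 300, 460  # i7-8700 price inferred from "$160 more"
perf_gain = 0.40                          # "~40% more powerful at stock"

price_gain = i7_7820x_price / i7_8700_price - 1
print(f"+{price_gain:.0%} price for +{perf_gain:.0%} performance")
# +53% price for +40% performance
```

So on CPU alone it's a slightly worse value per dollar, which is typical as you climb the stack; the extra cores and RAM channels are what you're really paying for.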

The i7-7820X doesn't include its own CPU cooler, but the upside there is how superior the cooling offered by even a humble $30 aftermarket cooler is. The motherboard for it will probably also carry a slight premium, and may call for an additional 100W of PSU headroom for safety, but that's your high in the high-middle-low:
  • Low: i7-8700 + LGA 1151 motherboard
  • Mid: i7-8700 + LGA 1151 motherboard + discrete GPU*
  • High: i7-7820X + LGA 2066 motherboard + GTX 1050 Ti 4GB + aftermarket CPU cooler
*The NVIDIA GT 1030 2GB or AMD RX 550 (512) 2GB are the cheapest cards available with a worthwhile performance upgrade (95%+) over the Intel UHD 630 graphics
 
Does anyone here do encoding work or basic video editing work? I'm stumped as to why my PC is sluggish for 4K editing. Should I upgrade the RAM?

i7-5820K OC'd to 3.9GHz
32GB DDR4-3000
MSI Nvidia GTX 1080 Ti
2x Samsung EVO 500 SSDs

I don't want to get an E Core; that would mean getting another PC altogether. What's my best bet for immediate performance gains? CPU or RAM? Or another SSD?
 
Apple is moving on from Intel because Intel isn’t moving anywhere
The Verge said:
A report from Bloomberg this week has made public something that should already have been apparent to tech industry observers: Apple is planning to replace Intel processors in Mac computers with its own chips starting sometime around 2020. The two California companies have enjoyed a long and fruitful partnership ever since Apple made the switch to Intel CPUs with the 2006 MacBook Pro and iMac, but recent trends have made the breakup between them inevitable. Intel’s chip improvements have stagnated at the same time as Apple’s have accelerated, and now iPhone systems-on-chip are outperforming laptop-class silicon from Intel’s Core line. Even if Intel never cedes its performance crown, the future that Apple is building will invariably be better served by its own chip designs.

Apple’s decision to ditch the world’s most popular CPU line for laptop and desktop computers may seem radical, but there are a number of key factors that actually make it obvious and unavoidable.

INTEL’S STAGNATION
Attend any major tech exhibition and you’ll find Intel announcing or reannouncing mildly improved processors. Whether you’re at IFA in Berlin, CES in Las Vegas, or Computex in Taipei, the spiel is always the same: the future is wireless, battery life matters to everyone, and there are a lot of people with five-year-old PCs who might notice a difference if they buy a new Intel-powered computer. It’s all painfully incremental and out of sync with Apple’s product cadence. Apple will give you, at most, two years with an iPhone before enticing you into upgrading, whereas Intel is trying to convince people with PCs that are half a decade old to do the same.

In the past, Intel could rely on microarchitecture changes one year and production process shrinkage another year to maintain its momentum of improvement. But the infamous Moore’s Law sputtered to an end back in 2015. Intel is approaching the limits of what’s possible to achieve with silicon, and it hasn’t yet figured out its next step. The chart below, compiled by AnandTech, illustrates Intel’s predicament well. Notice how long the 14nm process node has endured, the question marks next to the release window for 10nm chips, and the almost total absence of a future road map. In previous years, Intel’s ambitious plans would be known well in advance. (The company hasn’t grown more secretive; it just doesn’t seem to have any secrets left.) And without the power efficiency gains that come from building smaller chips, Intel just can’t compete with ARM processors designed for efficiency first.


APPLE’S AMBITION
If there’s one unifying theme to define everything that Apple does, it’s integration. From integrating components on a logic board to integrating an entire ecosystem of Apple devices like the iPhone, Macs, AirPods, and HomePod to integrating supply and distribution lines under its centralized control. Apple started designing its own iPhone chips because it didn’t want to be dependent on Qualcomm. A year ago, it started making its own graphics processors to shed its reliance on Imagination Technologies. Apple also created its own Face ID system, acquired the maker of its Touch ID system, and it was recently reported to be secretly developing its own MicroLED screens for the Apple Watch.

Apple will tell you that it’s obsessed with delighting the consumer, crafting elegantly designed objects, or some other lofty aspiration, but the company’s overriding ambition is to control every last minute aspect of its products. The Intel chips that have been at the heart of MacBooks and Macs for over a decade aren’t minute; they’re central to how each computer can be designed and engineered. Apple has stuck with them for so long because of Intel’s once-insurmountable lead, but the way we use computers is changing, the workloads on a CPU are changing, and Apple’s A-series of chips are better designed to handle that new world of computing. Plus, the iPhone has shown the advantages of designing hardware and software in harmony, requiring smaller batteries and less RAM than comparable Android rivals.

THE IOS LAPTOP
Apple’s macOS, the operating system that runs on Intel’s x86 architecture, is now legacy software. This may sound like a blunt allegation to make, given that Apple still sells plenty of MacBooks and iMacs, but the development of that OS within Apple seems to have halted entirely. Today, macOS feels like it’s in maintenance mode, awaiting a widely anticipated change that will produce a unified iOS and macOS operating system, with iOS taking precedence.

If this is true, it’s another step towards the next great Apple machine: a consumer laptop running iOS. Call it the MacPad, or revive the name iBook. Use the trackpad the way 3D Touch is used on iOS devices to easily move the cursor. (And build more tricks into it.) I’ll buy it. https://twitter.com/verge/status/980872061966082049

Mobile computing has firmly established itself as the predominant mode of use these days, and that trend will only grow more pronounced in the future. Apple’s primary software focus is rightly fixed on iOS, which happens to run on ARM instructions, not Intel’s x86. So, if Apple is indeed intent on bringing iOS up into its less-portable computing line, and if it has chips that offer comparable performance to Intel’s consumer CPUs (which it does), why not build all of that on top of its own processor? Whether it’s presented as a new-age iPad Pro or MacBook Air, a device that combines the strengths of iOS and the convenience of a clamshell design with a generous touchpad is something a lot of people would love to have. By pursuing this course of action, Apple gets to scratch its vertical integration itch while sating existing demand.

THE MOBILE OFFICE
The thing that makes it possible for Apple to even contemplate running its lithe mobile operating system on its more powerful computers is the way our computing habits are changing. Not only are we using mobile devices more often than desktop ones for entertainment, but we’re now doing most of our work on phones as well. You can be a professional photographer with just a Pixel 2, for instance. The phrase “phoning it in” certainly has a whole different ring to it in 2018 than it did at the beginning of this decade.

As investment and development dollars continue flowing into the dominant mobile platforms — Android and iOS — it’s logical to expect that every useful desktop application that hasn’t yet been adapted to them already is on its way there. Sure, Intel is likely to retain its dominance at the very high end of computing, but for the vast majority of people and situations, iOS will soon be able to provide all that users want. And once the software reaches that point, Apple will want to match it with hardware that’s powerful and ergonomic enough to take advantage.


It’s not just Apple that’s moving away from Intel processors. Google has been hiring and dabbling with its own custom chip designs, and Microsoft and Qualcomm this year started pushing Windows on ARM as an alternative to the typical Intel-powered laptops. The whole technology world is moving to developing and designing for mobile applications first, and Intel’s desktop roots keep holding it back from being competitive in that expanding market.

Apple’s moving on because Intel’s standing still.
This is monumental. The world is set to be ruled by these mobile APUs.

@Prutfis: come and see.
 
The Berry wasn't too interested in this thread, so I merged it back to here. I would have put it in Jefferz's "Other PC Products" thread instead of this build thread, but I'd just made all the posts in the past few pages discussing the profound implications of the full Fortnite port to iOS.

@method115
Happened sooner than I expected, but this is one of the first major steps to follow the port, given the changing hardware landscape I was discussing above. It ends the 12-year relationship between Mac and Intel that began when Apple cried uncle a dozen years ago. How things have changed.
 
Does anyone here do encoding work or basic video editing work? I'm stumped as to why my PC is sluggish for 4K editing. Should I upgrade the RAM?

i7-5820K OC'd to 3.9GHz
32GB DDR4-3000
MSI Nvidia GTX 1080 Ti
2x Samsung EVO 500 SSDs

I don't want to get an E Core; that would mean getting another PC altogether. What's my best bet for immediate performance gains? CPU or RAM? Or another SSD?
Jesus, your machine is a beast. I'd have to assume you have some software or settings issue. I don't know. Maybe what you perceive as "sluggish" is just how slowly 4K rolls in?

When I've consulted with professional video editors, they have always quoted obscene figures for RAM (as in 128GB) as reasonable expansions above 32GB. If I'm not mistaken, the primary benefit of that added capacity is the amount of preview video the RAM can store and play back while the machine is still rendering the file, so you can review how the job is going while you wait for it to finish.
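To put some rough numbers on why those preview-cache figures get so big, here's a sketch assuming uncompressed 8-bit RGBA 4K frames at 30fps (real editors cache compressed or half-res previews, so treat this as an upper bound):

```python
# Why 4K preview caching eats RAM: uncompressed frames are enormous.
# Assumptions: 3840x2160, 8-bit RGBA (4 bytes/pixel), 30 fps playback.

width, height = 3840, 2160
bytes_per_pixel = 4
fps = 30

frame_mb = width * height * bytes_per_pixel / 1024**2   # one frame in MB
seconds_per_gb = 1024 / (frame_mb * fps)                # playback time per GB cached

print(f"One frame: {frame_mb:.1f} MB")                  # One frame: 31.6 MB
print(f"Preview per GB of RAM: {seconds_per_gb:.1f} s") # Preview per GB of RAM: 1.1 s
print(f"128GB caches roughly {128 * seconds_per_gb:.0f} s of uncompressed preview")
```

In other words, even 128GB only buys you a couple of minutes of fully uncompressed 4K preview, which is why the quoted figures sound so obscene but aren't crazy.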

This guy has an incredibly similar setup to yours, so maybe there is something you can divine in his benchmarks. @val_lixembeau might appreciate the chart and his comments starting at 6:09, which demonstrate how a "pure" video edit (such as a transcode) without added effects receives zero benefit from GPU acceleration: specific evidence to buttress the advice I offered above.

 

The only direction Apple can go is ARM, unless they're going to come up with a completely new architecture and convince developers to code for it. I can't see that happening. The only other option is VIA; they do have the x86 license from Intel.
 
Definitely. They've already gone 64-bit ARM. I'm not sure how much that is going to matter; RISC is proving to be nearly twice as efficient as CISC per watt, even running real-world applications. There's more than one way to skin a cat.

Right now, obviously, this won't cripple Intel in terms of practical market impact. The overwhelming majority of laptops and desktops sold across the world are not Apple. But the picture gets more interesting, just as it does with cell phones, when you throw out all the Celeron/Pentium/i3/i5 models and just look at the i7s, particularly in the mobile space: like comparing only the more apples-to-apples (pardon the pun) Samsung flagship phones (Note/Galaxy S) vs. iPhone sales. If Apple can reign supreme in that space, they rule the world, I'm figuring, just as Intel has for the past generation since it separated from AMD.

Ultimately, this isn't just about Apple, obviously, but the whole evolving ARM vs. x86 battle you espied. Apple is just the leader right now, but they definitely don't move the majority of the world's devices. Once the Android chipsets catch up as well, made by those cheap Chinese manufacturers (some of whom are even designing their own ARM chips while paying the same fixed royalty to ARM as everyone else), this center of power seems poised to shift east.

In the long run, it's a highly realistic possibility that a company like Apple ends up just buying that patent license (as Intel did from AMD), or possibly Intel itself. It depends where things go, and how things turn out for Intel. Apple has the cold, liquid reserve capital to buy them outright right now, with over $50bn left over. :eek:

Yet, as scary as that is, I am far more terrified that it might not be Apple, and when I think about it...I realize my sinking belief that it won't be.
 

I don't keep up on ARM hardware at all, but aren't they still around Core 2 Duo-level performance?

It definitely won't cripple Intel, but it will hurt them. With Intel's past couple of launch failures, AMD chipping away, and the Spectre/Meltdown issues, they don't need to be losing any more business. There are going to be a lot of data centers looking toward AMD when it comes time to upgrade hardware.
I don't see Intel selling that patent anytime soon; I could see them buying AMD, though. That would help them enter the console market that they've been rumored to be getting into.

On the i7-8700 above, I'd still go with an aftermarket cooler. That stock cooler will be at full speed and sound like a jet when the processor maxes out.
 
Certainly. Intel would never willingly sell that patent, any more than they would willingly sell their company to someone else and give up control or internal profit distribution. Not right now. They would have to be brought low first.

As for CPU performance, it depends on whether you're talking about desktop or mobile CPUs. Versus desktop it doesn't really matter that much: a world of dying relevance.

Versus mobile, it's still a bit apples-and-oranges to compare, since all the ARM APUs in phones and tablets give up quite a bit of CPU for better graphical performance from the built-in GPU, and in this way the mainstream mobile world is actually far more inclusive of the "gamer" culture on a hardware level than the desktop world. The balance between CPU and GPU is tipped much more heavily toward the latter than it is with Intel processors. Within the context of market dominance, gamers were abruptly abandoned by desktop AIOs upon the arrival of the Core 2 Duo series and the Intel iGPUs a decade ago.

To get a cross-platform comparison that is a bit more CPU-intensive, I'll look to Geekbench 4. Keep in mind this suite emphasizes the portion of the APU (the CPU) where Intel holds up relatively better against the onslaught of the mobile APUs; the A11 actually outscores the new Raven Ridge APUs because of this.

The Apple A11 Bionic is a 10nm chip, employed in the iPhone X and iPhone 8 models, and here are its multicore scores juxtaposed against the most recent line of Intel-powered Macs. Also keep in mind that while the A11 is technically a 6-core chip, it runs in a 2+4 big.LITTLE configuration, so it's spiritually much closer to dual-core and, like the GPU, fares relatively better in single-core scores:

Geekbench 4
  • 19,373 = iMac, 27" Mid-2017, (Intel i7-7700K @4.2GHz, 4 cores)
  • 17,775 = iMac, 21.5" Mid-2017 (Intel i7-7700 @3.6GHz, 4 cores)
  • 15,284 = Macbook Pro 15" Mid-2017 (i7-7820HQ @2.9GHz, 4 cores)*
  • 15,240 = iMac, 27" Mid-2017 (Intel i5-7600K @3.8GHz, 4 cores)
  • 14,875 = iMac, 27" Mid-2017 (Intel i5-7600 @3.5GHz, 4 cores)
  • 14,410 = Macbook Pro, 15" Mid-2017 (Intel i7-7700HQ @2.8GHz, 4 cores)
  • 13,830 = iMac, 27" Mid-2017 (Intel i5-7500 @3.4GHz, 4 cores)
  • 12,792 = iMac, 21.5" Mid-2017 (Intel i5-7400 @3.0GHz, 4 cores)
  • 10,186 = iPhone 8 Plus (Apple A11 @2.4GHz, 6 cores)
  • 10,129 = iPhone 8 (Apple A11 @2.4GHz, 6 cores)
  • 10,123 = iPhone X (Apple A11 @2.4GHz, 6 cores)
  • 9,571 = Macbook Pro, 13" Mid-2017 (Intel i7-7567U @3.5GHz, 2 cores)
  • 9,167 = iMac, 21.5" Mid-2017 (Intel i5-7360U @2.3GHz, 2 cores)
*This is a laptop that starts at $2,800
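To make the gap concrete, here's the same comparison normalized against the top iMac, using a few of the scores from the list above:

```python
# Normalizing selected Geekbench 4 multicore scores (from the list above)
# against the fastest iMac, to show how close phone silicon already is.
scores = {
    'iMac 27" (i7-7700K, 4 cores)': 19373,
    'iPhone X (A11, 2+4 cores)': 10123,
    'MacBook Pro 13" (i7-7567U, 2 cores)': 9571,
}
baseline = scores['iMac 27" (i7-7700K, 4 cores)']
for name, score in scores.items():
    print(f"{name}: {100 * score / baseline:.0f}%")
# The iPhone X lands at ~52% of the fastest desktop iMac,
# and it outscores the dual-core 13" MacBook Pro outright.
```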

You get the idea. Furthermore, this is just the Apple chip that goes into phones. Just wait until the "A11X" fusion chip comes out, targeted at tablets. It won't gain much in CPU performance, but it will shit on the A11 in GPU performance, which is already neck and neck with the Intel UHD Graphics in the top Core i7 chips.
GFX Benchmark, 1080p Manhattan Offscreen
  • 511.6 fps = NVIDIA GTX 1080 Ti
  • 394.7 fps = NVIDIA GTX 1060 6GB
  • 363.6 fps = NVIDIA GTX 1060 3GB
  • 299.6 fps = NVIDIA GTX 1050 Ti
  • 238.1 fps = NVIDIA GTX 1050
  • 152.0 fps = NVIDIA GT 1030
  • 108.3 fps = iPad Pro 12.9 2nd Gen (Apple A10X)
  • 96.0 fps = i7-7567U (Intel UHD 650)
  • 95.8 fps = iPad Pro 10.5 (Apple A10X)
  • 84.7 fps = i7-8700K (Intel UHD 630)
  • 82.6 fps = Apple iPad Pro (Apple A9X)
  • 80.5 fps = i7-7700K (Intel UHD 630)
  • 75.2 fps = i3-8300 (Intel UHD 630)
  • 74.2 fps = iPhone 8 Plus (Apple A11)
  • 74.2 fps = i5-8600K (Intel UHD 630)
  • 74.1 fps = i7-7700 (Intel UHD 630)
  • 72.4 fps = i7-6700K (Intel HD 530)
  • 72.3 fps = iPhone 8 (Apple A11)
  • 64.2 fps = iPad 6th Gen (Apple A10)
  • 63.5 fps = iPhone X (Apple A11)
  • 55.3 fps = Apple iPad Pro 9.7 (Apple A9X)
It's legit gaming hardware that you can carry in your hand.
 
Last edited:
Certainly. Intel would never willingly sell that patent any more than they would willing sell their company to someone else and give up control or internal profit sharing distribution. Not right now. They would have to be brought low, first.

Per CPU performance, it depends on whether or not you're talking about desktop or mobile CPUs. Versus desktop it doesn't really matter that much: a world of dying relevance.

Versus mobile, it's still a bit apples-and-oranges to compare, since all the ARM APUs in phones and tablets give up quite a bit there for better graphical performance from the built-in APU, and in this way the mainstream mobile world is actually far more inclusive of the "gamer" culture on a hardware level than the desktop world. The balance between CPU and GPU is much more heavily tipped towards the latter than with Intel processors. Within the context of market dominance, gamers were abruptly abandoned by desktop AIOs upon arrival of the Core 2 Duo series and the Intel iGPUs a decade ago.

To give us a cross-platform comparison that is a bit more CPU intensive I'll look to Geekbench 4, and keep in mind this suite emphasizes performance across the portion of the APU-- the CPU-- where Intel holds relatively better against the onslaught of the mobile APUs. The A11 actually outscores the new Raven Ridge APUs because of this.

The Apple A11 Bionic is already a 7nm chip, employed in the iPhone X and iPhone 8 models, and here are their multicore scores juxtaposed to the most recent line of Intel-powered Macs. Also, keep in mind that while the A11 is technically a 6-core chip, it runs in a 2+4 big.LITTLE configuration, so it's spiritually much closer to dual core, and like the GPU, also does better relatively in single core scores:
  • 19,373 = iMac, 27" Mid-2017, (Intel i7-7700K @4.2GHz, 4 cores)
  • 17,775 = iMac, 21.5" Mid-2017 (Intel i7-7700 @3.6GHz, 4 cores)
  • 15,284 = Macbook Pro 15" Mid-2017 (i7-7820HQ @2.9GHz, 4 cores)*
  • 15,240 = iMac, 27" Mid-2017 (Intel i5-7600K @3.8GHz, 4 cores)
  • 14,875 = iMac, 27" Mid-2017 (Intel i5-7600 @3.5Ghz, 4 cores)
  • 14,410 = Macbook Pro, 15" Mid-2017 (Intel i7-7700HQ @2.8GHz, 4 cores)
  • 13,830 = iMac, 27" Mid-2017 (Intel i5-7500 @3.4Ghz, 4 cores)
  • 12,792 = iMac, 21.5" Mid-2017 (Intel i5-7400 @3.0GHz, 4 cores)
  • 10,186 = iPhone 8 Plus (Apple A11 @2.4GHz, 6 cores)
  • 10,129 = iPhone 8 (Apple A11 @2.4GHz, 6 cores)
  • 10,123 = iPhone X (Apple A11 @2.4GHz, 6 cores)
  • 9,571 = Macbook Pro, 13" Mid-2017 (Intel i7-7567U @3.5GHz, 2 cores)
  • 9,167 = iMac, 21.5" Mid-2017 (Intel i5-7360U @2.3GHz, 2 cores)
*This is a laptop that starts at $2,800

You get the idea. Furthermore, this is just the Apple chip that goes into phones. Just wait until the "A11X" fusion chip comes out that is targeted at tablets. It won't gain much in CPU performance, but it will shit on the A11 in terms of GPU performance, which is already shitting on the Intel graphics in the Core i7 chips.

It's legit gaming hardware that you can carry in your hand.
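To put those multicore numbers in perspective, the gaps work out to simple ratios. A quick sketch using three of the scores quoted above (the dictionary keys are just shorthand labels, not official model names):

```python
# Geekbench 4 multicore scores quoted in the list above.
scores = {
    "iMac 27in (i7-7700K)": 19373,
    "iPhone X (A11)": 10123,
    "MacBook Pro 13in (i7-7567U)": 9571,
}

a11 = scores["iPhone X (A11)"]
for name, score in scores.items():
    # Express the A11's score as a percentage of each machine's score.
    print(f"A11 scores {a11 / score:.0%} of the {name}")
```

So a phone chip lands at roughly half the top desktop iMac and actually edges out the dual-core 13" MacBook Pro, which is the whole point of the comparison.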

I didn’t know ARM had bridged that gap. That’s impressive.
With that knowledge, I change my view on them buying AMD.
Could they be trying to buy ARM?

There’s still the software compatibility issue that would have to be dealt with if they go with ARM. Apple has a crap ton of money but is their market share big enough to persuade the software developers to make an ARM compatible version of their product?
They could do a wrapper or whatever it was called when they switched from PowerPC but that would be a bandage and not a permanent solution.
 
I didn’t know ARM had bridged that gap. That’s impressive.
With that knowledge, I change my view on them buying AMD.
Could they be trying to buy ARM?
I'm sure they already have, if not publicly, but ARM isn't selling, so it's the same situation in that sense. ARM, though, unlike Intel, doesn't actually manufacture its own chips; they just license designs. So I'm not sure how anxious Apple is about not owning them. Besides, buying ARM would give Apple command over the licensing fees paid by all its Android opponents, so I strongly doubt anti-monopoly laws or federal regulators would allow it.

I don't know much about how corporate takeovers work. All I know is that Google tells me Apple has a $286bn mountain of cash just sitting around. That means they could buy any company in the world, save for a few dozen-- private, public, or state-owned-- without having to pay a single visit to a banker, figuratively speaking. Shit like that just blows my mind.
There’s still the software compatibility issue that would have to be dealt with if they go with ARM. Apple has a crap ton of money but is their market share big enough to persuade the software developers to make an ARM compatible version of their product?

They could do a wrapper or whatever it was called when they switched from PowerPC but that would be a bandage and not a permanent solution.
Android is the elephant in the room, since the international markets outside the USA are rapidly gaining in relative wealth and influence, and Android already ships far more total devices worldwide, just like Windows vs. MacOS machines. But right now Apple is still driving the mobile market's app development top-down, and PC/console software is coming to them (like Fortnite).

Microsoft Office is still one of the backbones of Microsoft profit, but software like that is becoming increasingly irrelevant with the rise of Google Docs and other competing cloud-based productivity copycats. It's just sort of getting bypassed by virtue of the fact that everybody's businesses are populated by people who own iPhones and iPads and like to use those products, personally, for business. They've already dominated the office space.

I recall a friend of mine who worked for a cooking oil reclamation trucking company. He had been on Android his whole life, but he had to learn the iPad because his company was forced to buy iPads: the existing desktop software they wanted to keep using was only compatible with iOS, so all the guys took iPads with them on the short hauls.
 
I'm flustered and need some help.

specs:
CPU: Intel - Xeon E3-1230 3.2GHz Quad-Core Processor
CPU Cooler: Enermax - ETS-T40F-BK 76.0 CFM CPU Cooler
Motherboard: Asus - P8Z77-V LK ATX LGA1155 Motherboard
Memory: Kingston - HyperX Fury Blue 8GB (1 x 8GB) DDR3-1866 Memory
Memory: Kingston - HyperX Fury Blue 8GB (1 x 8GB) DDR3-1866 Memory
Storage: SanDisk - SSD PLUS 240GB 2.5" Solid State Drive
Storage: Western Digital - Caviar Green 3TB 3.5" 5400RPM Internal Hard Drive
Video Card: PNY - GeForce GTX 970 4GB Video Card
Case: Fractal Design - Define C with Window ATX Mid Tower Case
Power Supply: Corsair - CXM 650W 80+ Bronze Certified Semi-Modular ATX Power Supply
Operating System: Microsoft - Windows 10 Pro OEM 64-bit

My problem is I can't get video output. When I boot the system I can hear the Windows login music and I can hear when I plug in/unplug USB devices. My audio comes from the onboard audio.
I've tested and verified the card to be working on 2 different computers, Z270 and H110. I even stress tested the card with FurMark for 30 minutes on those machines to make sure it was stable.
I've done all the normal first steps like trying it with only 1 ram stick, unhooking all nonessential cables on the mobo, tried different pci-e slots, tested my video cables, tested the monitors, ran the latest mobo bios and rolled it back, etc.
There's no bios updates for the card on PNY's website.
I've tried a different power supply, Seasonic 650w Gold, and nothing changed.
I verified the pci-e slot works on the Z77 mobo with a GTX 1070. I stressed tested it as well.
I swapped the processor out for an i3-2120, and when I did that, I could boot into the BIOS and Windows when connected to the onboard GPU. I tried disabling the onboard video but was greeted with a black screen; I could still hear Windows log in and the plugging/unplugging of USB devices.



Here's the parts that are confusing me
When I try the card in my other computers, I get a splash screen right away when I power the computer on that says "PNY GTX970 4GB" and some other stuff. I don't get that splash screen on the build I'm trying to put together.

As a hail mary, I put an AMD HD6450 in the pci-e slot below the 970, leaving the 970 in and hooked up. The fucking thing boots up just fine now, including the video card splash screen, and runs without a hiccup. I don't have a video cable hooked to the HD6450.
I decided to test this even further. I pulled out the HD6450 and put in an RX460 without the external power or a video cable connected. The 970 was still occupying the top slot with power connected. It boots into Windows and runs just fine. I get the same results if I use a GTX 1070 instead of the RX460.
The 970 shows up in the device manager without any issues.
I ran DDU to remove all the drivers and shut down the computer. I then pulled the HD6450 but that puts me back at square one.



Sorry if I'm all over the place. The damn thing works, but not how it's supposed to. I see no reason why I can't get it to work without the additional video card.
I feel like I've tested all the stuff within my capabilities, but there might be something simple I'm missing.

Anyone have any ideas?
@Brampton_Boy Do xeons have some weird shit I don't know about?
 
No fix for you Jeff, but in looking over all the best 4K TVs out there for gaming in another thread, I noticed something curious.

The Acer Predator X27 debuted at Computex 2017 in April and was promised to hit the market shortly thereafter:
https://www.acer.com/ac/en/GB/content/predator-series/predatorx27

Predator-X27-1.jpg


So where is it? This would still be the most advanced gaming display in the world if it hit the market: 4K, IPS, 144Hz, HDR, 4ms Response Time, Quantum Dot technology, and G-Sync all in the same package. It's the dream monitor for gamers.

It's just that it's never actually gotten to market: a full year on. So far, it's vaporware.
 
No fix for you Jeff, but in looking over all the best 4K TVs out there for gaming in another thread, I noticed something curious.

The Acer Predator X27 debuted at Computex 2017 in April and was promised to hit the market shortly thereafter:
https://www.acer.com/ac/en/GB/content/predator-series/predatorx27

Predator-X27-1.jpg


So where is it? This would still be the most advanced gaming display in the world if it hit the market: 4K, IPS, 144Hz, HDR, 4ms Response Time, Quantum Dot technology, and G-Sync all in the same package. It's the dream monitor for gamers.

It's just that it's never actually gotten to market: a full year on. So far, it's vaporware.

I’m still not sold on the Tobii eye tracking stuff.
Isn’t LG supposed to be releasing a monitor with those same specs in the near future?

I figured out my issue. I had to change a BIOS setting, PCI ROM Priority, for my PCI-E slot from UEFI to Legacy.
 
Bitcoin prices are tanking and are now below profitability, apart from extreme cases of free electricity.

Bitcoin prices will settle at about 2% above the cost of production at the cheapest producers (massive Chinese farms next to hydro dams).
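That floor claim is just break-even arithmetic: electricity cost per day divided by coins mined per day. A sketch where every number (hash rate, power draw, yield per TH/s, electricity price) is hypothetical, chosen only to show the calculation, not to reflect real network figures:

```python
# Hypothetical miner -- all figures illustrative, not current market data.
hashrate_th = 14.0          # TH/s for one ASIC
power_kw = 1.4              # wall power draw in kW
elec_price = 0.04           # $/kWh (cheap hydro-dam rate)
btc_per_th_day = 0.00002    # BTC earned per TH/s per day (varies with difficulty)

btc_per_day = hashrate_th * btc_per_th_day          # coins mined daily
elec_cost_per_day = power_kw * 24 * elec_price      # daily power bill in $
production_cost_per_btc = elec_cost_per_day / btc_per_day

print(f"Cost to mine 1 BTC: ${production_cost_per_btc:,.0f}")
print(f"Predicted price floor (+2%): ${production_cost_per_btc * 1.02:,.0f}")
```

With these made-up inputs the cheapest producer's cost lands around $4,800 per coin, so the claimed floor would sit 2% above that; the real figure moves with network difficulty and hardware efficiency.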
 