Tumgik
#AND its not 8 years old + the 3 fans and gpu fan and cpu fan. surely thats enough. the case even has space for more than that!!
dexaroth · 9 months
Text
i cant believe the day but i finally got a full tower pc. bought it already built and at a considerable discount of some 320 dollars off. its fucking huge and theres so many things going on inside... i was initially planning on choosing the parts myself but finding the graphics card was so hard and everyone else convinced me to just buy it built and honestly? good. id probably have fucked this up so badly by myself
i cant use it yet bc i took too long to buy the monitor that was also on sale and now its regular price -_- tho i managed to find a discount used one for now. well see how that goes since ill get it tomorrow. i tested it on out living room tv and it had some kaspersky thingy open and like thats so cute. i hope they left some treats in the browsing history for me to search through before i wipe it clean
#its a hexer case and wouldnt you guess the front has a hexagonal pattern. so pretty..#it came with 3 fans installed there too that have a cmyk color style to them and it looks quite neat. im thinking of buying some leds to pu#inside the case to go with my keyboard tho idk if id go that far tbh (< gamer rot is setting in. im not immune to pretty lighting..)#its also got a lot of unused space inside. im thinking of making more sculptures to put in. though idk if thatd be safe for it#bc cold porcelain is glue and water. what if it evaporates inside and suddenly everythings covered in a glue film#i wonder if varnish would help? the transparent nail polish sure didnt do shit it came off like 2 days after sculpting the rw slug sleeping#which like yeah of course. its nail polish. but i didnt expect it to flake since all it does is sleep on top of my laptop keyboard#i need miniature glass cake cover tops to encapsule every sculpture inside for safety#looking at it still no wonder these are called towers gotdamn its legit so huge..#it looks awkward tho bc i cant fully make it glue to the wall bc of the cables so its like. awkwardly a bit in front of the wall#im scaared as to how to tell if it ever gets too hot. on a laptop u just press ur head against the left half and feel how hot it is#i think im gonna need software for this.. sigh. tho maybe ill never get to that point since its supposed to be decent#AND its not 8 years old + the 3 fans and gpu fan and cpu fan. surely thats enough. the case even has space for more than that!!#the acrylic side reflects my keyboard too. so niceys. stimulation for my creature eyes#my desk is gonna be so fucked up when i have to organize everything too bc the one i have now is perfecly laptop-oriented#it sits on a custom wooden desk and the keyboard+drawing tablet sit below. but theres a shelf on top of my desk thats too low for the>#>normal monitor to sit to so i wont be able to use the custom desk. and i dont even know what ill do with my laptop either#finally a good change in my sad life routine fr. i cant wait to play watchdogs on this and overgrowth and other ones#AND LAGLESS KRITA SMUDGE ENGINE BRUSHES!!! AND DOUBLE BRUSHES. THEYRE SO LAGGY#A N D ACTUAL FULL HD NORMAL MONITOR. maybe that will get me to not draw in small canvases anymore#now im anxious i just want the day to be over to get the monitor tomorrow aouugh.. just bc i started coding my resources neocities page#dextxt#<the 'major life events' ((sorta)) tag returns. one for the books.. if something bad happens.. itll be here to remind me of the good times
2 notes · View notes
aniket1122-blog · 4 years
Text
Best gaming laptops in 2020
 Best gaming laptops in 2020The gaming industry is increasing day by day and has a lots of scope in the future. Many pro gamers make a handsome amount of money through gaming. And if you too want to step in the gaming industry then you must read this article.If you think you can be a pro gamer using an old PC which is thick like a suitcase then you are probably having a misconception. Nowadays there are a lots of laptops in the market that have raised the bars high of laptops. There are laptops as powerful as the
Asus ROG Zephyrus
and the
HP Pavilion
. If you are rich and have enough money then the
Dell alienware
is the best gaming laptop now.Nowadays you get good gaming laptops with Nvidia GeForce graphic processors which give an out of world experience.
Now before starting with the list I would like to tell you that I have done a lot of research to make this list. So please share this with your friends and drop down your feedback in the comments.
The best gaming laptop you can buy now
1. Asus ROG Zephyrus
There are a lots of laptop out there but I showcase the best gaming laptop here. Starting off with the list on number one we have the Asus ROG Zephyrus. Asus has consistently improved itself and now it is emerging as on of the best gaming laptop makers. It has a 9th generation Intel core i7-9750h processor which makes it an absolute beast .
You can check out th eful list of the features here:
Processor: 9th Gen Intel Core i7-9750H Processor base speed 2.6GHz (12M Cache, up to 4.5GHz, 6 Cores)
Memory & Storage: 16GB DDR4 2666MHz RAM upgradeable up to 32GB RAM with | Storage: PCIe NVMe 512GB M.2 SSD with additional 1x M.2 Slot for PCIe SSD expansion.
Graphics: NVIDIA GeForce GTX 1660 Ti GDDR6 6GB VRAM
Display: 15.6-inch Full HD (1920x1080) Anti glare IPS-level panel, Refersh Rate 144Hz, 3ms, 100% sRGB, Pantone Validated.
Operating System: Pre-loaded Windows 10 Home with lifetime validity
Design & battery: 18.9mm Thin | Magnesium-Alloy Chassis | Super-narrow bezels frame | Laptop weight 1.93kg | Lithium battery 4-Cell | Battery Life upto 8 hours*
Cooling System: Anti-Dust Self-Cleaning Tunnels | 83 Fans Blades | 0.1mm Ultraslim Fins| 5 Pipes CPU/GPU/VRM..
2.HP Pavilion gaming
HP has a good background in making laptops. It has always satisfied it's customers with the quality of it's products. The HP Pavilion is yet another good gaming laptop from HP. It is an affordable laptop and has good features in this price range. Though it has an i5 processor it is good for starting with. It is one of the best and affordable gaming laptop in this range.Here goes the features list:
Processor: 9th Gen Intel Core i5-9300H processor(2.40 GHz base processor speed, 8 MB cache, 4 cores), Max Boost Clock Up to 4.10 Ghz
Operating System: Pre-loaded Windows 10 Home with lifetime validity
Display: 15.6-inch Full HD (1920 x 1080) SVA anti-glare WLED-backlit display
Memory & Storage: 8GB (1x8GB) DDR4 RAM | Storage:1TB HDD Hybrid Storage | M.2 Slot available
Graphics: NVIDIA GeForce GTX 1050 4GB GDDR5 Dedicated Graphics
Design & Battery: Laptop weight: 2.17 kg | Lithium battery
Warranty: This genuine HP laptop comes with 1 year domestic warranty from HP covering manufacturing defects and not covering physical damage.
3. Lenovo Legion Y540
Lenovo, a trustable and good company. It has been consistently giving good laptops and always strives for betterment. The Legion Y540 is another satisfying gaming laptop. It has aa 9th gen i5 processor which is pretty fast and gives you a good experience while gaming.The features list is a bit long for all the laptops but you know that bigger is better.The features of this laptop are:
Processor: 9th Generation Core Intel I5-9300H, 2.4 Ghz base speed, 4.1 Ghz max speed, 4 Cores, 8Mb Smart Cache
Operating System: Preloaded Windows 10 Home, with lifetime validity
Display: 15.6-inch screen with (1920X1080) full HD display | Anti Glare technology | IPS display | 250Nits | 60Hz Refresh Rate
Memory and Storage: 8 GB RAM | Storage 1TB SSD
Design and Battery: Laptop weight: 2.3Kg | Thin bezeled gaming laptop | Battery Life: 5 hrs
Warranty: This genuine Lenovo laptop comes with 1 year domestic warranty from Lenovo covering manufacturing defects and not covering physical damage.
Pre-Installed Software: Windows 10 Home, Office Home and Student 2019 | Inside the box: Laptop, Charger, User Manual
4. Dell Gaming-G3
The Dell Gaming is a masterpiece from Dell. One of the most trusted brands for good gaming laptops. You can buy this laptop with your eyes closed. It has a 4GB Nvidia 1650 graphic card and comes with Windows 10. You can use it to play any game you want and get a smooth and lag free experience.Check out its cool features:
2.6GHz Intel Core i7-9750H 9th Gen processor
8GB DDR4 RAM
1TB 5400rpm hard drive
15.6-inch screen, NVIDIA 1650 4GB Graphics
Windows 10 operating system
10 hours battery life, 2.5kg laptop
1 year warranty
                              5. Acer Nitro 7
Acer has a very good experience in making good gaming laptops. It also has a very good history in making one of the best gaming laptops. The Acer Nitro 7 is a thin and light gaming laptop. It comes with 4GB of 1650 graphic card and windows 10. Good for all gamer.Don't forget to check out the features:
2.40 GhzGHz Intel Core i5-9300H processor 9th Gen processor
8GB DDR4 RAM
1TB SSD
15.6-inch screen, NVIDIA GeForce GTX 1650 with 4 GB of dedicated GDDR5 VRAM 4GB Graphics
Windows 10 Home 64 bit operating system
7 hours battery life, 2.5kg laptop
Processor: Intel Core i5-9300H processor, turbo up to 4.10 Ghz | Display: 15.6" display with IPS (In-Plane Switching) technology, Full HD 1920 x1080, high-brightness (300 nits) Led Backlit
Memory: 8 GB of DDR4 system memory, upgradable to 32 GB using two soDIMM modules | Storage: 1 TB SSD
Graphics: NVIDIA GeForce GTX 1650 with 4 GB of dedicated GDDR5 VRAM | Pre-installed: Windows 10 Home (64 bit)
Warranty: One-year International Travelers Warranty (ITW)
                Bonus for you all
This bonus laptop is best for gamers but you need a high budget. I am talking about the Alienware manufactured by Dell. It is a dream laptop for every gamer. It comes with an 8GB Nvidia 2080 graphics.Features:
2.4GHz Intel Core i9-9980HK 9th Gen processor
16GB DDR4 RAM
5400rpm hard drive
15.6-inch screen, NVIDIA 2080 8GB Graphics
Windows 10 Home operating system
10 hours battery life, 2.16kg laptop
Processor detail
15.6 inch, Windows 10 Home
1 year manufacturer warranty
Good bye and make sure you share this post and comment down.
2 notes · View notes
dailytechnologynews · 5 years
Photo
Tumblr media
6 Years Later - Reviewing My Build's Successes and Failures At 30,000 Hours of Uptime
So here it is, folks. The 4 grand, big brand, first hand grandstand on an often-neglected topic – what to expect of your machine after years of hard use. Listed below is the encased case I'll be studying for this piece. Components that look like this are in the machine currently. Components that look like this were replaced with upgrades. Components that look like this failed in service.
Type Item Price CPU Intel - Core i5-4670K 3.4 GHz Quad-Core Processor $230 CPU Cooler Noctua - NH-D14 64.95 CFM CPU Cooler $74.95 @ Amazon Motherboard Asus - Z87-Pro ATX LGA1150 Motherboard ~$180 Memory Mushkin - Redline 8 GB (2 x 4 GB) DDR3-1866 Memory ~$100? Upgraded Memory El Cheapo Nemix 32 GB (4 x 8 GB) DDR3-1600 Memory $150.00 Storage Samsung - 840 Series 120 GB 2.5" Solid State Drive ~$120? Storage Seagate - Barracuda 1 TB 3.5" 7200RPM Internal Hard Drive $58.49 @ OutletPC Replacement Storage Western Digital - BLACK SERIES 2 TB 3.5" 7200RPM Internal Hard Drive $119.16 @ OutletPC Video Card Asus - GeForce GTX 780 3 GB DirectCU II Video Card $650 Upgraded Video Card EVGA GeForce GTX 1080 FTW GAMING ACX 3.0 ~$650 Replacement Video Card EVGA GeForce RTX 2070 XC ULTRA GAMING $0 Case Fractal Design - Define R4 (Black Pearl) ATX Mid Tower Case ~$100 Power Supply Fractal Design - Newton R3 600 W 80+ Platinum Certified Semi-Modular ATX Power Supply ~$100 Optical Drive Asus - DRW-24F1ST DVD/CD Writer $21.39 @ OutletPC Operating System Microsoft - Windows 8 OEM 64-bit ~$100 Operating System Microsoft - Windows 10 64-bit $0 Case Fan Fractal Design - FD-FAN-SSR2-140 66 CFM 140mm Fan $13.89 @ SuperBiiz Monitor Asus - VG248QE 24.0" 1920x1080 144 Hz Monitor $246.00 @ Amazon Monitor Asus - VG248QE 24.0" 1920x1080 144 Hz Monitor $246.00 @ Amazon Upgraded Monitor LG - 34UM95 34.0" 3440x1440 60 Hz Monitor $750.00 Total ~$4,000 Generated by PCPartPicker
As you can tell I had two failures, both of them pretty major. I'll cover them a little farther down.
Starting off, these are the goals I had in mind when building this machine: First and foremost, I wanted the best performance in flight simulators and CAD/CAM software that I could justify spending for. I wanted perfect snappiness in Windows, MS Office, and web browsers. Second, I wanted longevity. Third, silence. I'd say this build achieved all of those things... but I have a few warnings for people looking to build a rig with a similar mindset.
I had to make multiple upgrades to the machine for it to keep up with the expanding RAM, VRAM, and storage requirements as sims like DCS got extra content and released updates with power-hungry graphics improvements. Also I may have purchased a much larger monitor and a VR headset... sorry 780.
If you have to skimp on things, don't skimp on the CPU, motherboard, or PSU. Although I have had the urge to get an M.2 SSD and upgrade my CPU for some time now (although really it's still keeping up perfectly fine), the fact that my current motherboard and RAM will also need to be replaced makes that unjustifiable. At this moment, for me to upgrade to a i7-8700k and an M.2 without losing RAM would cost about $1,200. Totally out of the ballpark.
Expect to have failures and do maintenance. I was lucky and had no DOA parts in the build, and the thing ran absolutely flawlessly for years. However about 5 years into the life of the machine, the 1TB storage drive suffered a soft failure. I noticed obvious performance issues, and with drive health monitoring software open I watched it slowly die as I attempted to transfer all the files I wanted elsewhere. I got everything important, but shit. You know the saying that while SSDs have a built-in service life, HDDs either fail within the first couple years or last until obsolescence? Ahhh... not in my experience. Anything I build from now on will probably be all-SSD.
(3 cont'd) As for the 1080 that died, that was much more dramatic. I'm flying along in the sublime DCS F/A-18C recreating Mongo's MiG-21 shootdown in the Gulf War when all at once the computer instantly powers off with a pop and the screen goes black. I'm thinking "...power outage?" until I smell it – something let the smoke out. After a postmortem I decide the smell had to have come from the GPU. So I throw in the old 780 and it boots up – but no video output. Shit. Video output from the IGPU works fine though? Huh. So I try a different PCIe slot and what do you know... I'm pretty sure my 1080 fried the only 16x slot on the board. Not too big a deal to run on 8x but now I feel the machine is in its twilight years with one of the newest components in the rig failing so spectacularly and running with a damaged motherboard. Being realistic though, I won't be at all surprised if this thing will keep going another 6 years or more with an SSD change.
Warranties matter as much or more than quality. At first, I went all-in on the highest quality parts I could get without paying any attention to the warranty service. To this day I still consider the Asus 780 DCUII an incredibly well-built card. When I retrieved it to replace the blown-up 1080 I was impressed all over again with how sturdy it felt and just the quality of work Asus put into it. But all cards can fail, and if the same thing that happened to my EVGA 1080 had happened to my Asus 780... well, I'd have been shit out of both luck and $650. As it stands I'm actually getting an upgrade out of this catastrophe (albeit still being left with a dead PCIe slot).
Don't bother with watercooling, not even AIOs except in very specific use cases. It's not anywhere close to being worth the headache for the vast majority of people going that route. The amount of additional maintenance and attention required to keep a watercooled rig going strong for so many years is way more than you're going to want to do. I know you're pretty into the hardware side of your computer now, but just trust me. You're going to be a substantially different person in 5 years, most likely one that wants a machine that just works without any doubts about water leaks, water line contamination, pumps dying, etc.
Shit's expensive, yo. Yes I know I didn't do my wallet any favors here, but just be aware that if you want to maintain a top-shelf rig for many years to come, get ready to shell out many thousands too. It's not a one-and-done purchase, even if you can handle falling behind the state of the art. I didn't even list all of my peripherals here. In addition to all of this I've also got a UPS, a Das Keyboard 4, monitor stand for the 34UM95 and an Ergotron arm mount for the VG248QEs, flight sim peripherals, headphones, DAC, and more. Plus power bills I've honestly got no clue how much this thing has cost me in total. At least $5.5k. Was it worth it? Oh fuck yeah it was worth it. But I'm not exactly on a tight budget here... don't stretch yourself for something that is ultimately probably going to serve as much as a distraction from responsibility as it will a tool for bettering your life. It undeniably is the latter... but you don't need to spend nearly as much if you just want a productivity machine.
What would I have done differently with the initial build? Probably nothing. I probably should have gone all-SSD a year or two ago but that's fine. In the near future I'll just replace the OS drive and add a storage SSD. My machine has been an absolute pleasure to own, a dream come true after years of the shitty family computer (even by 90s standards) and countless craptops. If you have the means, I highly recommend picking one up.
4 notes · View notes
Text
Radeon RX 580: Review 2021 | Testing| Specs | Profit ( Good and Bad Side)
Tumblr media
Radeon RX 580 Cryptocurrency Mining: Review 2021 | Testing: Specs | Profit | Hashrate | Settings | Setup| Configuration | CPU Performance: Rx 580 mining - Check out the specification, hashrate, profitability and the payback period of this miner as well as other critical information before buying. After the release of the Radeon RX 580 video card, the entire five hundredth series became one of the most profitable options, not only for gamers, but also for mining. Cards bought up immediately after their appearance in the store and the resulting shortage significantly influenced their price.  The cost of the RX 580 even from the “cheapest” vendors has grown by at least 30-40%. This greatly increased the payback period of the video card and shaken the leading position of the video card. Please note- The Review was done earlier than 2021 so their will be variation inters of the rate would earn when currency exchange is taken into consideration. Is the RX 580 the best and most promising choice for miners in 2018, and is it worth paying attention to this video card today when building a farm from scratch? In this article we will look at all the features, calculations and potential of the card for mining today and the future. People Also Ask: Common Questions Radeon RX 580 Specifications and power Radeon RX 580 First, let's briefly review the technical characteristics of the card, which will help to understand its relevance for mining and a place in the top. There are two versions of the RX 580, 4 and 8 gigabytes of memory. For the rest of the characteristics, there are almost no differences between the cards, and the relevance of the “extra” 4 GB of memory will be considered in the following blocks. - Graphic Engine: AMD Radeon RX 580 - Bus Standard: PCI Express 3.0 - Video Memory: 8GB GDDR5 - Engine Clock:  - 1380 MHz (OC Mode) - 1360 MHz (Gaming Mode) - Stream Processors: 2304 - Memory Speed: 8 Gbps - Memory Interface: 192-bit - Resolution: Digital Max RTop 10 Questions Asked on Amazon Before Buying - Geforce GTX 1060esolution:7680x4320 - Interface: - 1 x DVI-D - 2 x HDMI 2.0b - 2 x DisplayPort 1.4 - HDCP Support: Yes - Maximum Display Support: 4 - Software:ASUS GPU Tweak II  - Dimensions: 9.53 " x 5.07 " x 1.49 " Inch - Recommended PSU: 500W - Power Connectors: 1 x 8-pin - Slot: 2 Slot If you take the most reference version of the RX 580 NITRO from Sapphire, then the characteristics will be as follows: - The core frequency is 1340 MHz for silent mode and 1411 MHz for maximum boost. - Memory - 8,192 GB with a 255-bit bus and a frequency of 2 MHz. - 225 watts peak power. However, this is the most powerful map, which is dispersed from the factory to the maximum. Solutions from other vendors will have lower frequencies, and not only their overclocking potential will be important here, but also some of the manipulations associated with the firmware. The cornerstone of mining is overclocking. In terms of importance, this criterion is second only to the price of the video card itself, which determines the payback and the ratio of income and investment. As is the case with other video cards, everything will depend on the memory manufacturer. In 2018, it was almost impossible to find even the top NITRO and Limited Edition solutions with Samsung memory. Most of the cards come with Hynix memory, whose overclocking potential is significantly inferior, which will certainly affect the overall profit of the farm in mining. This indicator is not critical, but in the case of a possible choice, you should always give preference to Samsung video cards. Top 10 Questions Asked on Amazon Before Buying - Radeon RX 580 XFX Radeon RX 580 GTS Edition 1386MHz OC+, 8GB GDDR5, VR Ready, Dual BIOS, 3xDP HDMI DVI, AMD Graphics Card (RX-580P8DFD6) Question 1: What would this equal to a GeForce card? Answer: Between 1060 and 1070 Question 2: Is this one could build in a mini case? Answer: No, it barely fit in my mid-size NZXT with no modification needed Question 3: Is it with upgrading my nvidia gtx 1050ti 4 gb gddr5 direct 12 graphics card you this one?? Answer: Yes, that's what i had before too gtx 1050ti replaced with this XFX RX 580 GTS Question 4: Will the Radeon RX 580 card work with an msi h110m atx gaming motherboard? Answer: Yes, all you need is a PCI-Express 3.0 x16 Slot, and be sure your PC case is big enough to support a GPU of this size. Question 5: Can I play rainbow six siege with 8gb or 4gb? Answer: That is something you need to find out. Look at the system requirements for the game. Easy Google search. Question 6: It shows that this gpu needs 500 watt. I have a sonnet egfx breakaway box 550w with 80% efficient. it means it only have 440w. so can i use this gpu? Answer: Yes. ( Customer Answered: I have a 750W PSU, the card causes random reboots, it is a terrible design that overloads the 8-pin rail. Check the internet for issues with this particular manufacturer.) Question 7: Does this card come with free games? Answer: It did, when purchased it Question 8: Hey I am just worried , so I ordered nzxt mid tower and will this video card fit in?? Answer: It, Should. Question 9: The Radeon RX 580, is this card compatible with a dell xps 8700? Answer: Sure it does, as long as you have a case that will fit the card. Question 10: I have a ryzen 5 2400g will the rx 580 8gb be better than the gtx 1060 ti 3gb or 6gb for gaming? Answer: I average 120 - 150 fps in rainbow 6 siege if that helps? that on 144hz monitor: ( Customer had this to says as well - The 580 would be a better match because it is around the gtx 1070 in performance.) Best Review Posted on Amazon Before Buying - Radeon RX 580 Best Review Posted on Amazon Before Buying - Radeon RX 580 Customer Review 1 of Radeon RX 580: This card can handle games(destiny, fortnite, Pubg...etc) at high FPS with no issues. The card does use more power than an nvidia card with similar specs and also creates more heat. But at the same time value for the spec is great. As a product I would give it 4 stars however, XFX's warranty service is surprisingly easy and fast. My card broke after a year of use, so I registered the product on their site, I received a response within 24 hours. They troubleshooted and determined the card needs to be RMAed, I sent the card back and I received a new one within a week after they received the defective card. They didn't even require the receipt(required by most companies) even though I had it. It's amazing service compare to my PNY nvidia experience which was like pulling teeth. I will definitely buy more XFX products in the future. Customer Review 2 of Radeon RX 580: Used to be a big Nvidia fan. Then, I started encountering problems with their drivers. Fine, rolled back my driver to an older version. Then, my GTX 970 stopped working after only 1 year of use. Fine, sent it in and received a refurbished one. The refurbished one now has the same issue. Bought the XFX GTS RX 580 8GB DDR5. 0 driver issues. Card still works after a few months and puts out a better picture then my 970. I'll update my review if something goes wrong but, for now, I'm extremely pleased with this card both in terms of performance and price. I'll be all to happy to continue buying AMD in future. Customer Review 3 of Radeon RX 580: Shipping arrive on time and before I got home, which is a first. A few things to note: 1: This is a fairly large GPU, if you don't have a Full Tower, or an opened-air mobo, your mileage with fitting this thing in a Mid Tower WILL differ 2: My old GPU require two 6-pin, while this GPU required one 8-pin, you will either need an 8-pin or a 6+2 pin to connect your power supply to your GPU, if you don't have a powersupply that you modify, well you will need a new one. 3: For the price, this GPU is very strong, even if it is only a rehash of the 480... and being a year old. It can still run newer titles like Monster Hunter: World without too many problems. 4: Price-wise, having to compete with cryto-miners may increase the price of this GPU from time to time... very annoying. 5: This is a major update from my old HD 7950 (which still works for many games that are moderately intense and are new.) Worst Review Posted on Amazon Before Buying - Radeon RX 580 Bad Customer Review of the Radeon RX 580 on Amazon: After about 2 months of having this card, it had fried itself. I was playing fallout when my computer crashed and upon trying to start it up, my USB keyboard and mouse would get power for their LED's but the computer would not start, no fans would spin. A few days later I tried it again to which sparks shot out of my video card. It would have likely fried my whole computer if I hadn't unplugged my power strip from the wall. I later tried another video card that a friend gave me to which I downloaded the drivers for and it worked pretty much fine. Very disappointed in this product as it nearly ruined my entire computer. Bad Customer Review 2 of the Radeon RX 580 on Amazon: When playing intensive full screen games or benchmarks two of the three display ports have issues with intermittent black screens. Confirmed on TWO copies of this card (I bought one, it started doing it, exchanged the card for another, same exact issues). Have spent a decent amount of time doing all the normal troubleshooting (fresh install of Windows, clean and re-installation of AMD drivers, manually playing with the voltage and frequency of the card, etc). Please note: In spite of the bad review by a few customers, we experience no such issues (not saying you won't) but We would comfortably recommend this product for mining cryptocurrency based on our testing that will be continue below. How to increase the potential of the Radeon RX 580 in mining Even in the case of good versions of the RX 580 (Pulse and others), the potential of the card out of the box can hardly be called incredible. If you look at the average power of the video card, excluding the top solutions with good factory overclocking, you can count on the following indicators: - Equihash - 302-310 SOL / s. - X11gost - 8.4-8.6 MH / s. - Daggerhashimoto - 26.1-26.8 MH / s. - Pascal - 0.85 GH / s. The best solution for current maps from Radeon is Ethereum mining. It is during its mining that the maximum potential of the cards is revealed, therefore any trusses with the RX 500 series are usually collected for this critical currency. Mine others is impractical, it is a direct loss of profit. Initially, most of the 580s give from 18 to 22.5 MH / s without overclocking. Against the background of the possible 30-31 MH / s, which are sought by the owners of Radeon farms with the older model of the 500th series, the card mining capacity in the factory settings is not too high. However, a prerequisite for obtaining a better result is the BIOS firmware flashing. This can be called one of the conditional minuses of AMD cards, due to which some miners choose NVidia. However, with a successful acceleration and reduction of energy consumption, the yield of the RX 580 will be quite high, especially with price increases and the overall attractiveness of Ethereum. Therefore, in the absence of sufficient knowledge, it is better to entrust the firmware to specialists. They will make it quickly and without risk to get the so-called “brick”, that is, a “dead” card without the possibility of returning it under warranty. After flashing, the average potential of video cards will increase from 18-22 MH / s to minimum 26.5 MH / s. And in the case of good memory and successful overclocking, it is quite possible to get 28-30 MH / s, in which the RX 580 becomes one of the best in terms of return on video cards. On average, experts recommend focusing on 10-15% acceleration on average. This is the best indicator in terms of temperature, energy consumption and output power in the production of cryptocurrency. Choosing the best OS - Radeon Rx 580 The correct choice of operating system can significantly simplify work with the farm and even reduce costs. In the confrontation of Windows and Linux for farms on the RX 580 usually choose the latest OS. Hive OS has several important advantages over Windows, the most significant of which are: - There is no limit on 8 video cards. - No need to face the difficult process of choosing the right drivers. - The OS was originally developed and adapted for mining. - Ability to work without monitors (using emulators). - No need to have an SSD or HDD, a regular 8-16 GB flash drive is sufficient. - WatchDog is built into the system, you do not need to pay extra for it (as is the case with Windows). - Easy setup and functional remote monitoring. - No need to buy an expensive license. - Telegram notifications in real time. You also need to consider that in case of installation of pirated copies of Windows, the farm can be confiscated, as it is stipulated by law. With Linux, there will be no such problems, which is another important advantage. One of the few advantages of Windows, namely the ability to use a video card for tasks other than mining, is irrelevant, because 99% of the farms are originally created specifically for mining cryptocurrency. This is the main task. Farm payback with RX 580 One of the most important criteria when choosing a video card for mining is their payback. It depends on many conditions. Some of them have variable values, that is, they can constantly change. This concerns the recession or growth of the cryptocurrency rate. Taking into account the relevance of Ethereum, the total payback period for the RX 580 can be 7-9 months, in case of a good time (accumulation of currency on the wallet and sale after peak rises), and 12-15 months. This period is influenced by the following factors (in descending order from the most significant): - The course of the ether. - Firmware (that is, the disclosure of the potential of a particular card in terms of overclocking, power consumption, etc.). - The original cost of cards. - Total investment in the farm (any savings, for example, on the HDD, reduces costs and accelerates payback a little). Experienced miners recommend not to display a fixed broadcast systematically. This is best done only at times when there is a strong growth rate. In general, even under adverse conditions, the return on video card in 2018 is approximately 15-18 months. Given the current conditions for mining and farm profitability, this is not the worst time. The choice between the 4 GB and 8 GB versions for the Radeon RX 580 is almost always unambiguous. If for the GTX 1060 it is possible to consider the 3 GB version as very promising, then for AMD, on which usually the broadcast is mined, only the 8 GB version would be preferable. Of course, after the miner exceeds the allowable amount in the 4 GB version, you can switch to mining other cryptocurrencies, but this is not the best option. In such conditions, it will not be possible to extract the air on 4GB cards after February 6, 2021. Taking into account the not so big difference in prices between 4 and 8 GB, in most cases it is better to take a video card with a large amount of memory for the long term. As far as vendors are concerned, there is not much difference. Buying top solutions, for example, NITRO, is not always profitable. Despite the fact that they provide good cooling and increased power, their cost is much higher than that of the "regular" versions. It is much better to buy Asus Dual or other cards in the same price category. The difference between them, in comparison with the NITRO and LE versions, is about $ 150, but it is almost completely leveled by the correct firmware, pushing all the cards to approximately the same hashrate (+ - 3-4%). Conclusion Despite the fact that the Radeon RX 580 is no longer a top-notch solution, the positions of the video card are still preserved. This is one of the most sensible choices for mining in the medium and long term. With the right approach to buying cards for the farm and selling the naming ether, you can reduce the payback by almost half, making the RX 580 a leader in this criterion. The only significant drawback may be mandatory BIOS firmware. But this issue is easily solved with the help of specialists, because this deficiency can be considered conditional. HOW TO is the best main RX 580 graphics card... XFX Radeon RX 580 GTS Edition 1386MHz OC+, 8GB GDDR5, VR Ready, Dual BIOS, 3xDP HDMI DVI, AMD Graphics Card (RX-580P8DFD6) RX 580 8GB Test in 25 Games in 2020 https://www.youtube.com/watch?v=Je2BWKkkRK0 RX 580 series graphics card AMD and profitable mining - compatible? For lovers of cryptocurrency production, one performance at an exponential speed or another video card with a priority is increasingly an issue in our country. In addition to paying electricity bills and law enforcement officers, who are increasingly asking the president and the government to introduce cryptocurrency on the part of the state monopoly may not be legitimate. Although the mining system is systematically efficient in the process of revenge (this is the longer the time, the longer the "production" of a unit cryptocurrency), the Russian miners are still committed to the same, if in the yard of 2012. Performance RX 580 Series AMD's products are firmly in the leadership of mining fans for many years in a continuous circle. Although their main competitor, Nvidia, has significantly improved the compatibility of mining cards during the five-year period, the company's main support is still the gaming industry and other media. The RX series of 8Gb 580 is updated with the Radeon RX 570 and put into production in 2017 and has picked up a large number of manufacturers. The most popular versions are the models released by "Sapphire" and "EM-ES-AJ". The most popular model series - Features At the end of the summer of 2017, the main mode of this line proved the sapphire NITRO. In Western Europe, the player miner offended all the last tilted copies as the AMD Radeon RX 580, so this card snatched pereproyti "Witcher 3" from the store for the first time with a good frame rate per second. What can please our RX5808 g: -  a clock frequency of 1450 MHz for the possibility of diffusion; -  The impressive number of stream processors - 2304; -  GDDR5 memory is 8192 megaGB, with a frequency of 2000 MHz. In the numerous tests conducted by independent publications of computer hardware, we did not see the special adaptation of this graphics card mining. Many experts even suspect that the commercial success of the card may be caused by "popular rumors" than any actual benefit. When digging "Ether" (Vendetta, ETH for short), with the current version of the miner Claymore double ETH (firmware 9.2), our "Radeon" I give all 22, 5 MHS. It is interfered with the factory settings without a memory or mining calculator. Of course, the result is unqualified. Obviously, memory timing is a high clock frequency that may have to be kept visible by production attenuating. We don't even dare to imagine - it will be 4GB of RAM in this one more budget RX 580. And for the RX 580 Nitro and Western and Russian experts agree that the main news sapphire Radeon revenge enough speed. Perhaps the so-called sapphire pulse is expected to update the model to change the situation of the Meining Ç580. What is the competition? From a little more known to us, the quotation of the MSI RX 580 has appeared on our new version of the "Sapphire". Only when the temperature reaches 60 ° C ventilation system and armor technology in the performance of the company's lead blade cooler it can not be more than those who buy cards to revenge or not. Maintaining a good and strong air supply cycle keeps the investment in the field. In contrast, whether it is the "Sapphire" and "ASUS" versions, this card is able to switch memory modes. What, again, it is more important for gamers and office workers. Read the full article
0 notes
Text
Mac Mini For Photoshop
Tumblr media
The entry-level Mac mini offers a 3.6GHz quad-core i3 processor for £799/$799 which may not fulfil the needs of the typical designer, but the £1,099/$1,099 version offers a 3.0GHz 6-core i5. Apple FINALLY updated their Mac Mini product line late in 2018 (with a minor update in March 2020), making it a pretty good option for photographers to run Lightroom and Photoshop. This buying guide provides insight into which model and what configuration options photographers should consider. The best budget Mac for photo editing is the 2019 Mac Mini for about $900 and the best budget PC is the Dell Inspiron 3670 for about $650. Add the ViewSonic VA2719-2K-SMHD 27 Inch display for about $220 and a 4TB hard drive for another $100. So you are a photographer on a tight budget in need of a computer to run Lightroom and Photoshop. Hi John - The PA272W-BK-SV 27' 16:9 IPS Monitor with SpectraViewII from NEC comes equipped with the NEC SpectraViewII color calibration tool and features enhanced color accuracy covering 99.3% of the Adobe RGB color space, 94.8% of the NTSC color space, and 146.4% of the sRGB color space. With a variety of input connectors including DisplayPort, Mini DisplayPort, HDMI, and DVI-D Dual-Link, you.
Mac Mini For Photoshop Cs6
Mac Mini For Photoshop Software
What is Final Cut Pro X like running on a new Mac mini? We take a look at the new model, its features and how well FCPX performs. It even beats an iMac Pro in one of our tests!
Before we plug the new Mac mini up, it is important to understand that this version of the unit has changed. Changed a lot.
Back in 2005, the Mac mini was designed for switchers from PCs. It didn’t come with a screen, keyboard or mouse and keeping the price down helped make the transition to Mac OS 10.3 Panther and a PowerPC processor as painless as possible.
Times and technology have changed, no need now for the DVD slot or a spinning hard disk for storage.
The switchers of today are buying MacBook Airs and MacBook Pros as their first Macs, not the mini.
So this gives Apple a chance to change and retarget the use of the Mac mini. Consequently, that’s exactly what they have done with the new range of models.
But instead of offering (To quote Steve Jobs) a ‘stripped down Mac’ they’ve actually put the logic board on steroids!
Tumblr media
The fourth generation Mac mini now has a choice of quad and 6-core processors, up to 64GB of Ram, up to 2TB of SSD storage and the option of a 10GigE port over the standard GigE.
There are also four USB-C Thunderbolt 3 ports fed from two controllers, an HDMI port that supports 4K and two USB 3 ports. So yes, you can plug your own keyboard and mouse in without having to buy any adaptors.
Can the user upgrade the RAM in the new machine? Yes, it is possible, but it is not a case of flipping up a slot and exchanging the cards out.
You have to remove the cooling fan and then slide out the logic board. Make a mess of it and you’ll invalidate the warranty, so it is best to stick to an Apple approved centre for the upgrade. Should you want to get your spudger out and see the insides of the new Mac mini, head over to the excellent iFixit site.
The machine on test is a 3.2GHz 6‑core 8th‑generation Intel Core i7, 32GB 2666MHz DDR4, Intel UHD Graphics 630, 1TB SSD storage and the 10 Gigabit Ethernet option.
If the colour of the Mac mini looks familiar, it is exactly the same as the iMac Pro and (after seeing them side by side) the Blackmagic external GPU. This also makes sense of the decision for Apple to sell the black keyboard and mouse and also probably hints at the new Mac Pro colour. (And possible footprint- a skyscraper sized oblong trashcan?)
Connecting it up
I’m lucky to have 2 10GigE connections on the back of my QNAP NAS. Final Cut Pro X needs fast drives to be able to build the ‘always live’ waveforms and thumbnails.
Plumbing the Mac mini into the edit system is easy and I’ve detailed how to do point to point 10Gig connection before if you haven't got a router/switch.
If you need the internet on the machine, a cheap USB to Ethernet adaptor from Amazon for $15 works as well as anything else. Run a speed test to make sure you are accessing the NAS via the higher speed route.
I powered a 4K monitor from the HDMI port; this works well and avoids any more dongles having to be bought.
After 25 years of Mac ownership, I have enough keyboards and a spare mouse to finish off the system!
Power On
The machine comes with Mojave installed and defaults to dark mode on the now silent boot. The machine is very quiet and can’t be heard over the fan of the nearby QNAP, which isn’t that loud either. This machine could easily sit on your desktop and not annoy you or your co-worker.
I read a recent review that the sound was pretty terrible out of the Mac mini. I’d disagree. It isn’t great, but it isn’t bad either. I’d say it was better than the old cylinder Mac Pro, but not in the league of the Mac notebooks. You wouldn’t want to use it as edit monitoring.
The Mac mini does have a headphone socket and I can see many editors sitting in offices with the machine on the desk and headphones on editing all day.
Although I worked the machine hard with rendering, the shell didn't get too hot. However I did feel the warm rush of air out of the back of the machine which caught me by surprise when the mini was angled away from me.
I wouldn't have it in this orientation for continuous use, ie situated under a central monitor post with cables pointing at you. Instead I suggest putting the cables and heat exhaust to the back. It will make plugging the headphones in a bit trickier, but I hate sitting in draught!
Final Cut Pro X Performance!
Let’s give it a real test and put it up against an iMac Pro!
I thought I’d dive straight in with a 4K Project and use a Library with a short minute and a half sequence with large still images, transitions, multiple title and adjustment layers.
It is in a Library that I duplicated to both machines with the media staying on the QNAP. The cache was set to both desktops respectively as the RAM in both runs at up to 3Gig speed.
Mac Mini For Photoshop Cs6
Tumblr media Tumblr media
Although I didn’t time it, it felt like the thumbnails were taking a longer time to draw than I’m used to on the iMac Pro. All render files were deleted before all tests.
Unrendered, the Mac mini played the sequence back in Better Performance, but dropped frames on Better Quality. The iMac Pro played back in both settings without a problem.
Skimming seemed just as fast as the iMac Pro and it didn’t feel underpowered when navigating the timeline or browser.
The iMac is an 8 core machine with 64GB of RAM and more importantly a Radeon Pro Vega 64 GPU.
Both machines are connected to the same storage via 10GigE and renders and exports went to the respective desktops. (Ok I know it is NBase-T which supports 1Gb, 2.5Gb, 5Gb and 10Gb)
Tumblr media
Render time Mac mini 7’03” iMac Pro 1’43”
Mac Mini For Photoshop Software
4K ProRes 422 Export Mac mini 6’45 iMac Pro 1’40”
As expected here as FCPX uses the GPU for image processing, the much lower powered Mac mini takes a lot longer.
Compressor Convert to HEVC 4K 8Bit Mac mini 54” iMac Pro 55”
Well, the Mac mini beat the iMac Pro! This is because all of the conversion is done on the CPUs.
Not really worth testing for Motion as apart from a few things like particles, Motion almost lives on the GPU.
Compressor Clusters
No, not a new breakfast cereal, but Compressor allows you to share the work out over connected machines.
I didn’t realise that the ability to set up a cluster of machines running Compressor was easy to do. Well, when I say easy my first attempt failed, but that might be down to my slightly quirky network topology with the QNAP.
To build a cluster, on the machines you want to add, open up a copy of Compressor. In the preferences, turn the option on for other computers to process batches.
Then on the host machine, make a cluster from the available machines in the list. Here you can see we have got something very wrong! We will be revisiting this topic with the issue fixed when we have more time.
Then having named your cluster (or other single machine), you can then toggle the processing destination in the dropdown menu on the bottom of the host machine's Compressor GUI.
To work properly and fast, all the machines need to be connected with 10GigE via a 10GigE switch. The costs of this networking is now a lot cheaper, Netgear and QNAP make a suitable budget switch.
There is no limit to the amount of Mac minis you can have in a cluster. You'll probably run out of switch ports first!
One note here. For distributed processing, it has to be a self contained movie that gets automatically diced and sliced and sent off to the cluster machines.
Conclusions
Not what I expected. I guess I was in the state of mind thinking that the Mac mini wasn’t a serious machine for anything other than web browsing, Plex serving or basic Photoshop.
It’s a lot more than than. It is a component in building a modular system, which is a new thought considering that Apple has been criticised over the past few years for lack of upgradability in the Mac Pro and iMac Pro.
There are two reasons for this new direction. The first are Thunderbolt 3 connections giving the option of using an external GPU and therefore factoring out the limited onboard Intel offering.
The second is the option of a 10GigE port. Being able to connect to high speed shared storage without going through an adaptor is a huge plus.
Why? Take sever centres for example. Every App on the iOS App Store has been compiled on a Mac. Rack up rows of Mac minis connected with 10GigE and you have a facility that can get apps ready quickly. No need for costly large GPUs here, all the work will be done by the CPU cores. Once set up, the Mac mini is more than happy to run in a headless mode. I've worked with a few producers like that.
This is also true of building a small Mac mini cluster to do the hard work of making all the different deliverables of an FCPX exported finished movie - while you carry on editing something else with your main machine. It would make sense for a large production or facility house to have a rack of these that everybody could access when needed.
Put five or ten of these together in a rack and you have a very fast DIT tool for making proxies and dailies on set. The Mac mini above gets its first on set DIT experience tomorrow!
Tumblr media
I’ll leave you with a final thought.
Spec up a 6-core Mac mini with 32GB of RAM, 1TB of SSD storage and the soon to ship Blackmagic RX Vega 56 eGPU and you have a machine that’s not too far from the base model iMac Pro, wait for it... with over £1,200 left spare.
Granted, you’ll have to supply your own monitor, keyboard and mouse, but if upgradability is important to you, this could be a very clever way of getting the power with the flexibility.
Hopefully we will have an eGPU to test soon :)
Peter Wiggins is a broadcast freelance editor based in the UK although his work takes him around the world. An early adopter of FCP setting up pioneering broadcasts workflows, his weapon of choice is now Final Cut Pro X.
You can follow him on Twitter as @peterwiggins or as he runs the majority of this site, you can contact him here.
Tumblr media
0 notes
charger-batteries · 3 years
Text
Dell XPS 13 (9310) Review
The newest Dell XPS 13 is one of the first ultraportable laptops to come with Intel's latest 11th Generation "Tiger Lake" CPUs, offering speedy, efficient computing performance and long battery life. This 2.8-pound laptop also has an exceptional 13.4-inch display and a gorgeous chassis, all of which combine to make it our Editors' Choice pick among premium Windows ultraportables. The price is a bit high, starting at $999.99 and ringing up at $1,649 as tested, but it's worth it for uncompromising fans of cutting-edge performance and style.
The XPS 13's Moment to Shine
If you were tempted to buy an XPS 13 earlier this year following its significant redesign (model 9300) but didn't pull the trigger, it's a good thing you waited. Now you can get everything we like about the new laptop with the added bonus of the latest Intel silicon.
The CPU bump is essentially the only change from the 9300 to the current model 9310, but it's an important one if you plan to keep your machine for five years or more. Dell does churn out new XPS 13 models at a prodigious rate, sometimes multiple times per year. Still, a brand-new processor and a physical redesign that's not even a year old make the 9310 a safe buy for people who don't want their expensive investment to be upstaged by something vastly better in a few months.
The XPS 13 is admirably thin and feels satisfyingly solid, if not particularly lightweight. It measures 0.58 by 11.6 by 7.8 inches (HWD) and weighs 2.8 pounds in the touch-screen configuration reviewed here. Versions without a touch screen weigh slightly less, at 2.64 pounds, since they lack the touch version's Gorilla Glass 6 coating over the display. Either of those weight measurements compare favorably with the Dell's archrival, the Apple MacBook Pro 13, which weighs 3.1 pounds. But the XPS 13 is still meaningfully heavier than the very lightest models on the market, such as Acer's 1.96-pound Swift 7 flagship.
The XPS 13's additional heft accommodates high-quality materials and a sleek design. The density of the build is apparent as soon as you slide the system out of its sleek white box and run your fingers over the aluminum lid and edges. Dell says the edges are anodized twice to prevent scratch damage from repeated plugging and unplugging of peripherals. Opening the lid results in even more to ogle. The two color options include Platinum Silver with a black carbon-fiber palm rest or Frost White with an Alpine White composite-fiber palm rest. Our review unit uses the latter scheme, and it's gorgeous. The palm rest is especially snazzy, and it incorporates a UV- and stain-resistant coating to prevent yellowing and discoloration.
If you like the XPS 13's styling, you’ll be interested to know Dell has expanded it across the range, which includes the latest versions of the larger XPS 15 and XPS 17 laptops. This is a similarly unified approach to the one Apple takes with the styling of the MacBook Air and the two sizes of the MacBook Pro. Whether you're looking for an ultraportable for frequent travels or a large-screen machine with serious computing power, there's an XPS for you. There's even a 2-in-1 version of the XPS 13, which sports a 360-degree hinge that lets you convert the laptop into a tablet.
An Exceptional Display, Even Without 4K
The XPS 13's display is available in three versions, all of which feature an unusual 16:10 aspect ratio instead of the more familiar 16:9. The former results in additional vertical space, which is handy for when you're scrolling through websites or updating lengthy documents.
Our review unit has a 1,920-by-1,200-pixel touch panel. Thanks to the aspect ratio, the resolution is a bit higher than full HD (1,920 by 1,080 pixels), but considerably less than the Retina Display of the MacBook Pro or the PixelSense display of the Microsoft Surface Laptop 3. However, the XPS 13 can be configured with a 3,840-by-2,400-pixel panel that leapfrogs not only those two laptops, but also the dimensions of standard widescreen 4K displays (3,840 by 2,160 pixels).
Once you’ve used a 4K screen, it's hard to go back to full HD, with its occasionally visible pixels and slightly grainy text. Perhaps that's why Apple and Microsoft don't offer full HD versions of the MacBook Pro or Surface Laptop 3. But I actually don't mind the XPS 13's screen resolution. Images appear especially vivid, which I attribute partly to the taller aspect ratio that results in slightly more pixels than a 1080p display and partly to the Dell's 100% sRGB and 90% DCI-P3 gamut support. I also appreciate the extraordinary rated maximum of 500 nits of brightness, which means the XPS 13 can even be viewed comfortably outdoors (though not in direct sunlight) if you crank up the brightness setting.
So I'm not recommending the 4K screen over the full HD one in this case. That's an added benefit to people watching their budgets, since the 4K version does add to the cost. On the other hand, I recommend staying away from Dell's entry-level screen, which is the same as the one on our review unit except that it lacks touch support. That's a shame, since many XPS 13 competitors offer touch support standard, with the notable exception of the MacBook Pro.
Feats of Miniaturization
In an impressive achievement in downsizing, a 720p webcam complete with IR face recognition sensors is located above the center of the display. It offers average video quality for a laptop camera, which is to say that indoor shots are slightly noisy and fuzzy compared with the quality from the cameras of even a midrange phone. Dell says it has improved the camera quality by adding a new four-piece lens and temporal noise reduction, but if you're planning to hold a Skype session in your living room at night, you'll still probably want to use your phone. The camera's chief innovation is its minuscule size—the XPS 13's screen occupies 91.5% of the footprint of the chassis, which means the bezels surrounding it are razor-thin.
One of the consequences of a compact laptop is less room for ports. The XPS 13's are limited to two USB Type-C ports with Thunderbolt 4 support, a headphone jack, and a microSD card reader. This means you'll need an adapter or dongle to plug in an external monitor or USB Type-A peripherals. (Dell thoughtfully includes a USB adapter in the box.) This could be a drawback for the work-from-home crowd, who will likely be using the XPS 13 with an external display. While it's true that the XPS 13 is simply following the trend toward fewer and fewer ports, its selection is stingy even among its peers. The MacBook Pro offers as many as four USB-C ports, all of which support Thunderbolt 3.
The ultraportable offers the latest Wi-Fi 6 (802.11ax) and Bluetooth 5 wireless connectivity standards, good for stable internet connections and wireless keyboards and mice. But many users will happily stick with the built-in touchpad and keyboard, both of which I find to be comfortable for short typing and tapping sessions. The large keycaps and extensive surface area of the pad are welcome improvements over the cramped equivalents on some competitors, including the Asus ZenBook 13. The power button in the upper right corner of the keyboard doubles as a fingerprint reader for password-free logins to your Windows 10 account.
Audio quality from the XPS 13's stereo speakers is excellent. Combined, they deliver up to 4 watts of output, and they're balanced enough to give the laptop far richer and more dimensional sound than you'd expect from such a compact package. Much of the audio emanates through a grille on the bottom of the laptop, but voice tracks and other treble notes in a few movie trailers that I watched never sounded muffled.
Dell supports the XPS 13 with a one-year hardware warranty, and offers optional extensions up to four years for an additional charge.
Testing the XPS 13: Goodbye Ice, Hello Tiger
The new XPS 13 ditches Intel's 10th Generation "Ice Lake" processors in favor of the latest Tiger Lake CPUs. There's not a huge difference between the two, but we did see some modest performance improvements on a few of our benchmark tests compared with the 9300 model. Our test unit comes with a Core i7-1165G7, a quad-core chip with Hyper-Threading that runs at a base frequency of 2.8GHz, up from 1.3GHz in the equivalent 10th Generation Core i7. The higher clock speed can improve performance on certain tasks, though the total number of cores and threads remains the same.
While the XPS 13 has always relied on integrated graphics rather than a discrete GPU, this model boasts Intel's latest Iris Xe silicon, replacing the Iris Plus graphics of its predecessor. Our review unit also has 16GB of memory and a 512GB solid-state drive, which should be sufficient for most users. The entry-level configuration, meanwhile, comes with a Core i3, 8GB of RAM, and a 256GB SSD. That's a relatively skimpy set of components compared with the MacBook Pro, whose entry-level configuration includes a Core i5. But the entry-level MacBook Pro is $300 more than the base XPS 13.
Below is a list of specs for our XPS 13 tester and a few other comparable laptops we've tested recently, including the Apple MacBook Pro, the Asus ZenBook 13, the Razer Blade Stealth 13, and the Microsoft Surface Laptop 3.
Of the group, the Asus is the only other contender to sport a Tiger Lake CPU.
CPU, Media, and Storage Tests
Our first look at overall performance comes from the Windows-only PCMark performance suite developed by the benchmark specialists at UL (formerly Futuremark). The PCMark 10 test we run simulates different real-world productivity and content-creation workflows. We use it to assess overall system performance for office-centric tasks such as word processing, spreadsheet jockeying, web browsing, and videoconferencing. The XPS 13 performs very well, though essentially the same as the similarly equipped ZenBook 13.
PCMark 8, meanwhile, has a storage subtest that we use to assess the speed of the system's boot drive. Like PCMark 10, it yields a proprietary numeric score (higher numbers are better). Most recent laptops with SSDs perform roughly equally well in this test, which is the case here.
Next is Maxon's CPU-crunching Cinebench R15 test, which is fully threaded to make use of all available processor cores and threads. Cinebench stresses the CPU rather than the GPU to render a complex image. The result is a proprietary score indicating a PC's suitability for processor-intensive workloads. The hierarchy on this test is clear: the Ice Lake-based Surface Laptop 3 and Blade Stealth 13 are a rung below the Tiger Lake XPS 13 and ZenBook 13. The MacBook Pro's Core i5 processor is an overachiever.
Cinebench is often a good predictor of our Handbrake video-editing trial, another tough, threaded workout that's highly CPU-dependent and scales well with cores and threads. In it, we put a stopwatch on test systems as they transcode a standard 12-minute clip of 4K video to a 1080p MP4 file. It's a timed test, and lower results are better. The XPS 13 is locked in a surprising tie for first place with the MacBook Pro, with the Asus not far behind.
We also run a custom Adobe Photoshop image-editing benchmark. Using an early 2018 release of the Creative Cloud version of Photoshop for Windows and the latest Photoshop CC release for macOS, we apply a series of 10 complex filters and effects to a standard JPEG test image. We time each operation and, at the end, add up the total execution time. As with Handbrake, lower times are better here. The Photoshop test stresses the CPU, storage subsystem, and RAM. The XPS 13 performs well—better than the Apple, but not quite as quick as the ZenBook 13.
Graphics and Battery Life Testing
One of the main benefits of Intel's Tiger Lake platform is the switch from the older Iris Plus to Iris Xe graphics. This improvement actually has only a small impact in our graphics testing, which uses the Windows-only 3DMark and Superposition game simulations to render sequences of highly detailed, gaming-style 3D graphics that emphasize particles and lighting.
We run two different 3DMark subtests, Sky Diver and Fire Strike, which are suited to different types of systems. Both are DirectX 11 benchmarks, but Sky Diver is more suited to laptops and midrange PCs, while Fire Strike is more demanding and made for high-end PCs to strut their stuff. The results are proprietary scores.
Like 3DMark, the Superposition test renders and pans through a detailed 3D scene and measures how the system copes. In this case, it's rendered in the company's eponymous Unigine engine, offering a different 3D workload scenario than 3DMark for a second opinion on the machine's graphical prowess.
The Iris Xe laptops show a slight advantage over their Iris Plus counterparts, but the difference isn't remarkable. It's also much less of an advantage than what is offered by an entry-level gaming GPU like the Nvidia processor in the Blade Stealth 13. The bottom line for graphics output is that the XPS 13 will be able to handle pretty much anything you throw at it except for intensive 3D games, which require a dedicated GPU to run smoothly. (For more analysis of Iris Xe performance, check out our Iris Xe primer and in-depth Tiger Lake testing feature.)
Equipped with a 52-watt-hour battery that lasted for 15 hours in our video playback test, the XPS 13 should easily survive an entire workday away from a power outlet...
That's a very good result, even though it's "just" in line with what the competition offers and actually slightly shorter than the 17.5 hours we saw from the previous model 9300.
Hear Me Roar: Today's Best High-End Ultraportable
The XPS 13 is an exceptional ultraportable whose key strengths are its cutting-edge computing components in a beautifully designed, exceptionally well-constructed chassis. These are rare achievements even in the crowded field of premium ultraportable laptops. Operating system differences aside, the XPS 13 is probably a better choice than the MacBook Pro for most people right now, since Apple's notebook is using older-generation processors that face imminent replacement.
One of the few reasons not to choose the XPS 13 is if you're seeking a robust gaming experience on the side. Intel's Iris Xe silicon is pretty good if you are willing to dial things back (this is not the integrated graphics of a year or two ago), but even so, the Nvidia GeForce GTX-equipped Blade Stealth 13 is likely a more prudent choice, at the minor expense of shorter battery life, a not-quite-as-sleek chassis, and an older-generation CPU. Otherwise, the XPS 13 is the cream of the crop, and retains our Editors' Choice award as 2020's best high-end ultraportable laptop to date.
0 notes
suzanneshannon · 4 years
Text
Review of the Surface Book 3 for Developers
I was offered a Surface Book 3 to use as a loaner over the the last 5 weeks. I did a short video teaser on Twitter where I beat on the device with a pretty ridiculous benchmark - running Visual Studio 2019 while running Gears of War and Ubuntu under WSL and Windows Terminal. I have fun. ;)
Hey they loaned me a @surface book 3! So...I threw EVERYTHING at it...Visual Studio, Gears of War, Ubuntu/WSL2/Windows...*all at the same time* because why not? LOL (review very soon) pic.twitter.com/FmgGCBUGuR
— Scott Hanselman (@shanselman) May 14, 2020
Size and Weight
My daily driver has been a Surface Book 2 since 2017. The new Surface Book 3 is the exact size (23mm thick as a laptop) and weight (3.38 and 4.2 lbs.) as the SB2. I have had to add a small sticker to one otherwise I'd get them confused. The display resolutions are 3000×2000 for the 13.5-inch model and 3240×2160 for the 15-inch one that I have. I prefer a 15" laptop. I don't know how you 13" people do it.
Basically if you are a Surface Book 2 user the size and weight are the same. The Surface Book 3 is considerably more power in the same size machine.
CPU and Memory
They gave me an i7-1065G7 CPU to test. It bursts happily over 3.5 Ghz (see the compiling screenshot below) and in my average usage hangs out in the 2 to 1.8 range with no fan on. I regularly run Visual Studio 2019, VS Code, Teams, Edge (new Edge, the Chromium one), Ubuntu via WSL2, Docker Desktop (the WSL2 one), Gmail and Outlook as PWAs, as well as Adobe Premiere and Audition and other parts of the Creative Suite. Memory usually sits around 14-18 gigs unless I'm rendering something big.
It's a 10th gen Intel chip and as the Surface Book 3 can detach the base from the screen, it's both a laptop and tablet. I gleaned from Anandatech that TDP is between 10 and 25W (usually 15W) depends on what is needed, and it shifts frequencies very fast. This is evident in the great battery life when doing things like writing this blog post or writing in Edge or Word (basically forever) versus playing a AAA game or running a long compile, building containers, or rendering a video in Premiere (several hours).
FLIP THE SCREEN AROUND? You can also when docked even reverse the screen! Whatever do you mean? It's actually awesome if you want an external keyboard.
All this phrased differently? It's fast, quickly, when it needs to be but it's constantly changing the clock to maximize power/thermals/battery.
SSD - Size and Speed
The device I was loaned has a Toshiba KXG60PNV2T04 Hard Drive 2TB NVMe M.2 that's MASSIVE. I'm used to 512G or maaybe a 1TB drive in a Laptop. I'm getting used to never having to worry about space. Definitely 1TB minimum these days if you want to play games AND do development.
I ran a CrystalBenchmark on the SSD and it did 3.2GB/s sequential reads! Sweet. I feel like the disk is not the bottleneck with my development compile tests below. When I consulted with the Surface team last year during the conception of the Surface Book 3 I pushed them for faster SSDs and I feel that they delivered with this 2TB SSD.
GPU - Gaming and Tensorflow
The 13.5-inch model now comes with an NVIDIA GeForce GTX 1650 Max-Q GPU with 4GB of GDDR5 memory in its Core i7 variant, while the 15-inch unit features a NVIDIA GeForce GTX 1660 Ti Max-Q with 6GB of GDDR6 memory. When running the Gears 5 Benchmark while plugged in (from the Extras menu, Benchmark) is has no issues with the default settings doing 60fps for 90% of the benchmark with a few dips into the 57 range depending what's on screen.
It's not a gaming machine, per se, but it does have a NVIDIA GeForce GTX 1660 Ti so I'm basically able to 1080p 60fps AAA games. I've played Destiny 2, Gears of War 5, and Call of Duty Modern Warfare on default settings at 60 fps without issue. The fan does turn on but it's very manageable. I like that whenever we get back into hotels I'll be able to play some games and develop on the same machine. The 15" also includes an Xbox Wireless Adapter so I just paired my controller with it directly.
I was also able to run Tensorflow with CUDA on the laptop under Windows and it worked great. I ran a model against some video footage from my dashcam and 5.1 gigs of video RAM was used immediately and the CUDA engine on the 1660Ti is visible working in Taskman. The commercial SKU has an NVIDIA Quadro RTX 3000 that is apparently even more tuned for CUDA work.
Developer Performance
When I built my Intel i9 Ultimate Desktop 3.0 machine and others, I like to do compile tests to get a sense of how much you can throw at machine. I like big project compiles because they are a combination of a lot of disk access and a lot of parallel CPU work. However, some projects do have a theoretical maximum compile speed because of the way the dependences flesh out. I like to use Orchard Core for benchmarks.
Orchard Core is a fully-featured CMS with 143 projects loaded into Visual Studio. MSBUILD and .NET Core supports both parallel and incremental builds.
A warm build of Orchard Core on IRONHEART my i9 desktop takes just under 10 seconds.
My 6 year old Surface Pro 3 builds it warm in 62 seconds.
A totally cold build (after a dotnet clean) on IRONHEART takes 33.3 seconds.
My Surface Pro 3 builds it cold in 2.4 minutes.
I'll do the same build on both my Surface Book 2 and this new Surface Book 3 to compare. I've excluded the source folders from Defender as well as msbuild.exe and dotnet.exe. I've also turned off the Indexer.
A cold build (after a dotnet clean) on this Surface Book 3 takes 46 seconds.
A warm build is 16.1 seconds
A cold build (after a dotnet clean) on my Surface Book 2 takes 115 seconds.
It's WAY faster than my Surface Book 2 which has been my daily driver when mobile for nearly 3 years!
Benchmarks are all relative and there's raw throughput, there's combination benchmarks, and all kinds of things that can "make a chart." I just do benchmarks that show if I can do a thing I did before, faster.
You can also test various guesses if you have them by adding parameters to dotnet.exe. For example, perhaps you're thinking that 143 projects is thrashing to disk so you want to control how many CPUs are used. This has 4 physical cores and 8 logical, so we could try pulling back a little
dotnet build /maxcpucount:4
The result with Orchard Core is the same, so there is likely a theoretical max as to how fast this can build today. If you really want to go nuts, try
dotnet build -v diag
And dig through ALL the timing info!
Webcam Quality
Might be odd to add this as its own section but we're all using our webcams constantly right now. I was particularly impressed with the front-facing webcam. A lot of webcams are 720p with mediocre white balance. I do a lot of video calls so I notice this stuff. The SB3 has a 1080p front camera for video and decent light pickup. When using the Camera app you can do up to 5MP (2560x1920) which is cool. Here's a pic from today.
Ports and Power and Sound and Wi-Fi
The Surface Book 3 has just one USB-C port on the right side and two USB 3.1 Gen 2s on the left. I'd have liked one additional USB-C so I could project on stage and still have one additional USB-C available...but I don't know what for. I just want one more port. That said, the NEW Surface Dock 2 adds FOUR USB-C ports, so it's not a big deal.
It was theoretically possible to pull more power on the SB2 than its power supply could offer. While I never had an issue with that, I've been told by some Destiny 2 players and serious media renderers that it could happen. With the SB3 they upped the power supply with 65W for the base 13.5-inch version and a full 127W for the 15-inch SKUs so that's not an issue any more.
I have only two Macs for development and I have no Thunderbolt devices or need for an eGPU so I may not be the ideal Thunderbolt consumer. I haven't needed it yet. Some folks have said that it's a bummer the SB3 doesn't have it but it hasn't been an issue or sticking point for any of my devices today. With the new Surface Dock 2 (below) I have a single cable to plug in that gives me two 4k monitors at 60Hz, lots of power, 4 USB-C ports all via the Dock Connector.
I also want to touch on sound. There is a fan inside the device and if it gets hot it will run. If I'm doing 1080p 60fps in Call of Duty WarZone you can likely hear the fan. It comes and goes and while it's audible when the fan is on, when the CPU is not maxed out (during 70% of my work day) the Surface Book 3 is absolutely silent, even when running the monitors. The fan comes on with the CPU is bursting hard over 3Ghz and/or the GPU is on full blast.
One other thing, the Surface Book 3 has Wi-Fi 6 even though I don't! I have a Ubnt network and no Wi-Fi 6 mesh points. I haven't had ANY issues with the Wi-Fi on this device over Ubnt mesh points. When copying a 60 gig video file over Wi-Fi from my Synology NAS I see sustained 280 megabit speeds.
The New Surface Dock - Coming May 26th
I'm also testing a pre-release Surface Dock 2. I suspect they wanted me to test it with the Surface Book 3...BUT! I just plugged in every Surface I have to see what would happen.
My wife has a Surface Laptop 2 she got herself, one son has my 6 year old old Surface Pro 3 while the other has a Surface Go he got with his allowance. (We purchased these over the last few years.) As such we have three existing Surface Docks (original) - One in the kids' study/playroom, one in the Kitchen as a generalized docking station for anyone to drop in to, and one in my office assigned me by work.
We use these individual Surfaces (varying ages, sizes, and powers) along with my work-assigned Surface Book 2 plus this loaner Surface Book 3, so it's kind of a diverse household from a purely Surface perspective. My first thought was - can I use all these devices with the new Dock? Stuff just works with a few caveats for older stuff like my Surface Pro 3.
RANDOM NOTE: What happens when you plug a Surface Pro 3 (released in 2014) into a Surface Dock 2? Nothing, but it does get power. However, the original Surface Dock is great and still runs 4096 x 2160 @30Hz or 2960 x 1440 @60Hz via mini DisplayPort so the Pro 3 is still going strong 6 years out and the kids like it.
So this Surface Dock 2 replaces the original Dock my office. The Surface Dock 2 has
2x front-facing USB-C ports (I use these for two 4k monitors)
2x rear-facing USB-C ports
2x rear-facing USB-A 3.2 (10Gbps) ports
1x Gigabit Ethernet port
1x 3.5mm audio in/out port
Kensington lock slot - I've never used this
First, that's a lot of USB-C. I'm not there yet with the USB-C lifestyle, but I did pick up two USB-C to full-size DisplayPort cables at Amazon and I can happily report that I can run both my 4k monitors at 60hz plus run the main Surface Book 3 panel. The new Dock and its power supply can push 120 watts of power to the Surface with a total of 199 watts everything connected to the dock. I've got a few USB-C memory sticks and one USB-C external hard drive, plus the Logitech Brio is USB 3, so 6 total ports is fine with 4 free after the two monitors. I also Gigabit wired the whole house so I use the Ethernet port quite happily.
Initially I care about one thing - my 4k monitors. Using the USB-C to DisplayPort cables I plugged the dock into two Dell P2715Q 4ks and they work! I preferred using the direct cables rather than any adapters, but I also tested a USB-C to HDMI 2.0 adapter I got in 2018 with some other Dell monitors in the house and that worked with the Surface Book 3 as it had previously with the Book 2.
SURPRISE NOTE: How does the super-thin Surface Pro X do when plugged into a Surface Dock 2? Amazing. It runs two 4k monitors at 60 Hz. I don't know why I was shocked, it's listed on the support page. It's a brand new device, but it's also the size and weight of an iPad so I was surprised. It's a pretty amazing little device - I'll do another post on just the ARM-based Surface Pro X another time.
One final thing about the new Dock. The cable is longer! The first dock had a cable that was about 6" too short and now it's not. It's the little things and in this case, a big thing that makes a Dock that much nicer to use.
Conclusion
All in all, I'm very happy with this Surface Book 3 having been an existing Surface Book 2 user. It's basically 40-50% faster, the video card is surprisingly capable. The SSD is way faster at the top end. It's a clear upgrade over what I had before, and when paired with the Surface Dock 2 and two 4k monitors it's a capable developer box for road warriors or home office warriors like myself.
Sponsor: Have you tried developing in Rider yet? This fast and feature-rich cross-platform IDE improves your code for .NET, ASP.NET, .NET Core, Xamarin, and Unity applications on Windows, Mac, and Linux.
© 2020 Scott Hanselman. All rights reserved.
Tumblr media Tumblr media Tumblr media Tumblr media Tumblr media
      Review of the Surface Book 3 for Developers published first on https://deskbysnafu.tumblr.com/
0 notes
philipholt · 4 years
Text
Review of the Surface Book 3 for Developers
I was offered a Surface Book 3 to use as a loaner over the the last 5 weeks. I did a short video teaser on Twitter where I beat on the device with a pretty ridiculous benchmark - running Visual Studio 2019 while running Gears of War and Ubuntu under WSL and Windows Terminal. I have fun. ;)
Hey they loaned me a @surface book 3! So...I threw EVERYTHING at it...Visual Studio, Gears of War, Ubuntu/WSL2/Windows...*all at the same time* because why not? LOL (review very soon) pic.twitter.com/FmgGCBUGuR
— Scott Hanselman (@shanselman) May 14, 2020
Size and Weight
My daily driver has been a Surface Book 2 since 2017. The new Surface Book 3 is the exact size (23mm thick as a laptop) and weight (3.38 and 4.2 lbs.) as the SB2. I have had to add a small sticker to one otherwise I'd get them confused. The display resolutions are 3000×2000 for the 13.5-inch model and 3240×2160 for the 15-inch one that I have. I prefer a 15" laptop. I don't know how you 13" people do it.
Basically if you are a Surface Book 2 user the size and weight are the same. The Surface Book 3 is considerably more power in the same size machine.
CPU and Memory
They gave me an i7-1065G7 CPU to test. It bursts happily over 3.5 Ghz (see the compiling screenshot below) and in my average usage hangs out in the 2 to 1.8 range with no fan on. I regularly run Visual Studio 2019, VS Code, Teams, Edge (new Edge, the Chromium one), Ubuntu via WSL2, Docker Desktop (the WSL2 one), Gmail and Outlook as PWAs, as well as Adobe Premiere and Audition and other parts of the Creative Suite. Memory usually sits around 14-18 gigs unless I'm rendering something big.
It's a 10th gen Intel chip and as the Surface Book 3 can detach the base from the screen, it's both a laptop and tablet. I gleaned from Anandatech that TDP is between 10 and 25W (usually 15W) depends on what is needed, and it shifts frequencies very fast. This is evident in the great battery life when doing things like writing this blog post or writing in Edge or Word (basically forever) versus playing a AAA game or running a long compile, building containers, or rendering a video in Premiere (several hours).
FLIP THE SCREEN AROUND? You can also when docked even reverse the screen! Whatever do you mean? It's actually awesome if you want an external keyboard.
All this phrased differently? It's fast, quickly, when it needs to be but it's constantly changing the clock to maximize power/thermals/battery.
SSD - Size and Speed
The device I was loaned has a Toshiba KXG60PNV2T04 Hard Drive 2TB NVMe M.2 that's MASSIVE. I'm used to 512G or maaybe a 1TB drive in a Laptop. I'm getting used to never having to worry about space. Definitely 1TB minimum these days if you want to play games AND do development.
I ran a CrystalBenchmark on the SSD and it did 3.2GB/s sequential reads! Sweet. I feel like the disk is not the bottleneck with my development compile tests below. When I consulted with the Surface team last year during the conception of the Surface Book 3 I pushed them for faster SSDs and I feel that they delivered with this 2TB SSD.
GPU - Gaming and Tensorflow
The 13.5-inch model now comes with an NVIDIA GeForce GTX 1650 Max-Q GPU with 4GB of GDDR5 memory in its Core i7 variant, while the 15-inch unit features a NVIDIA GeForce GTX 1660 Ti Max-Q with 6GB of GDDR6 memory. When running the Gears 5 Benchmark while plugged in (from the Extras menu, Benchmark) is has no issues with the default settings doing 60fps for 90% of the benchmark with a few dips into the 57 range depending what's on screen.
It's not a gaming machine, per se, but it does have a NVIDIA GeForce GTX 1660 Ti so I'm basically able to 1080p 60fps AAA games. I've played Destiny 2, Gears of War 5, and Call of Duty Modern Warfare on default settings at 60 fps without issue. The fan does turn on but it's very manageable. I like that whenever we get back into hotels I'll be able to play some games and develop on the same machine. The 15" also includes an Xbox Wireless Adapter so I just paired my controller with it directly.
I was also able to run Tensorflow with CUDA on the laptop under Windows and it worked great. I ran a model against some video footage from my dashcam and 5.1 gigs of video RAM was used immediately and the CUDA engine on the 1660Ti is visible working in Taskman. The commercial SKU has an NVIDIA Quadro RTX 3000 that is apparently even more tuned for CUDA work.
Developer Performance
When I built my Intel i9 Ultimate Desktop 3.0 machine and others, I like to do compile tests to get a sense of how much you can throw at machine. I like big project compiles because they are a combination of a lot of disk access and a lot of parallel CPU work. However, some projects do have a theoretical maximum compile speed because of the way the dependences flesh out. I like to use Orchard Core for benchmarks.
Orchard Core is a fully-featured CMS with 143 projects loaded into Visual Studio. MSBUILD and .NET Core supports both parallel and incremental builds.
A warm build of Orchard Core on IRONHEART my i9 desktop takes just under 10 seconds.
My 6 year old Surface Pro 3 builds it warm in 62 seconds.
A totally cold build (after a dotnet clean) on IRONHEART takes 33.3 seconds.
My Surface Pro 3 builds it cold in 2.4 minutes.
I'll do the same build on both my Surface Book 2 and this new Surface Book 3 to compare. I've excluded the source folders from Defender as well as msbuild.exe and dotnet.exe. I've also turned off the Indexer.
A cold build (after a dotnet clean) on this Surface Book 3 takes 46 seconds.
A warm build is 16.1 seconds
A cold build (after a dotnet clean) on my Surface Book 2 takes 115 seconds.
It's WAY faster than my Surface Book 2 which has been my daily driver when mobile for nearly 3 years!
Benchmarks are all relative and there's raw throughput, there's combination benchmarks, and all kinds of things that can "make a chart." I just do benchmarks that show if I can do a thing I did before, faster.
You can also test various guesses if you have them by adding parameters to dotnet.exe. For example, perhaps you're thinking that 143 projects is thrashing to disk so you want to control how many CPUs are used. This has 4 physical cores and 8 logical, so we could try pulling back a little
dotnet build /maxcpucount:4
The result with Orchard Core is the same, so there is likely a theoretical max as to how fast this can build today. If you really want to go nuts, try
dotnet build -v diag
And dig through ALL the timing info!
Webcam Quality
Might be odd to add this as its own section but we're all using our webcams constantly right now. I was particularly impressed with the front-facing webcam. A lot of webcams are 720p with mediocre white balance. I do a lot of video calls so I notice this stuff. The SB3 has a 1080p front camera for video and decent light pickup. When using the Camera app you can do up to 5MP (2560x1920) which is cool. Here's a pic from today.
Ports and Power and Sound and Wi-Fi
The Surface Book 3 has just one USB-C port on the right side and two USB 3.1 Gen 2s on the left. I'd have liked one additional USB-C so I could project on stage and still have one additional USB-C available...but I don't know what for. I just want one more port. That said, the NEW Surface Dock 2 adds FOUR USB-C ports, so it's not a big deal.
It was theoretically possible to use pull power on the SB2 than it's power supply could push. While I never had an issue with that, I've been told by some Destiny 2 players and serious media renderers that it could happen. With the SB3 they upped the power supply with 65W for the base 13.5-inch version and a full 127W for the 15-inch SKUs so that's not an issue any more.
I have only two Macs for development and I have no Thunderbolt devices or need for an eGPU so I may not be the ideal Thunderbolt consumer. I haven't needed it yet. Some folks have said that it's a bummer the SB3 doesn't have it but it hasn't been an issue or sticking point for any of my devices today.
I also want to touch on sound. There is a fan inside the device and if it gets hot it will run. If I'm doing 1080p 60fps in Call of Duty WarZone you can likely hear the fan. It comes and goes and while it's audible when it's one, when the CPU is not maxed out (during 70% of my work day) the Surface Book 3 is absolutely silent, even when running the monitors. The fan comes on with the CPU is bursting hard over 3Ghz and/or the GPU is on full blast.
One other thing, the Surface Book 3 has Wi-Fi 6 even though I don't! I have a Ubnt network and no Wi-Fi 6 mesh points. I haven't had ANY issues with the Wi-Fi on this device over Ubnt mesh points. When copying a 60 gig video file over Wi-Fi from my Synology NAS I see sustained 280 megabit speeds.
The New Surface Dock - Coming May 26th
I'm also testing a pre-release Surface Dock 2. I suspect they wanted me to test it with the Surface Book 3...BUT! I just plugged in every Surface I have to see what would happen.
My wife has a Surface Laptop 2 she got herself, one son has my 6 year old old Surface Pro 3 while the other has a Surface Go he got with his allowance. (We purchased these over the last few years.) As such we have three existing Surface Docks (original) - One in the kids' study/playroom, one in the Kitchen as a generalized docking station for anyone to drop in to, and one in my office assigned me by work.
We use these individual Surfaces (varying ages, sizes, and powers) along with my work-assigned Surface Book 2 plus this loaner Surface Book 3, so it's kind of a diverse household from a purely Surface perspective. My first thought was - can I use all these devices with the new Dock? Stuff just works with a few caveats for older stuff like my Surface Pro 3.
RANDOM NOTE: What happens when you plug a Surface Pro 3 (released in 2014) into a Surface Dock 2? Nothing, but it does get power. However, the original Surface Dock is great and still runs 4096 x 2160 @30Hz or 2960 x 1440 @60Hz via mini DisplayPort so the Pro 3 is still going strong 6 years out and the kids like it.
So this Surface Dock 2 replaces the original Dock my office. The Surface Dock 2 has
2x front-facing USB-C ports (I use these for two 4k monitors)
2x rear-facing USB-C ports
2x rear-facing USB-A 3.2 (10Gbps) ports
1x Gigabit Ethernet port
1x 3.5mm audio in/out port
Kensington lock slot - I've never used this
First, that's a lot of USB-C. I'm not there yet with the USB-C lifestyle, but I did pick up two USB-C to full-size DisplayPort cables at Amazon and I can happily report that I can run both my 4k monitors at 60hz plus run the main Surface Book 3 panel. The new Dock and its power supply can push 120 watts of power to the Surface with a total of 199 watts everything connected to the dock. I've got a few USB-C memory sticks and one USB-C external hard drive, plus the Logitech Brio is USB 3, so 6 total ports is fine with 4 free after the two monitors. I also Gigabit wired the whole house so I use the Ethernet port quite happily.
Initially I care about one thing - my 4k monitors. Using the USB-C to DisplayPort cables I plugged the dock into two Dell P2715Q 4ks and they work! I preferred using the direct cables rather than any adapters, but I also tested a USB-C to HDMI 2.0 adapter I got in 2018 with some other Dell monitors in the house and that worked with the Surface Book 3 as it had previously with the Book 2.
SURPRISE NOTE: How does the super-thin Surface Pro X do when plugged into a Surface Dock 2? Amazing. It runs two 4k monitors at 60 Hz. I don't know why I was shocked, it's listed on the support page. It's a brand new device, but it's also the size and weight of an iPad so I was surprised. It's a pretty amazing little device - I'll do another post on just the ARM-based Surface Pro X another time.
One final thing about the new Dock. The cable is longer! The first dock had a cable that was about 6" too short and now it's not. It's the little things and in this case, a big thing that makes a Dock that much nicer to use.
Conclusion
All in all, I'm very happy with this Surface Book 3 having been an existing Surface Book 2 user. It's basically 40-50% faster, the video card is surprisingly capable. The SSD is way faster at the top end. It's a clear upgrade over what I had before, and when paired with the Surface Dock 2 and two 4k monitors it's a capable developer box for road warriors or home office warriors like myself.
Sponsor: Have you tried developing in Rider yet? This fast and feature-rich cross-platform IDE improves your code for .NET, ASP.NET, .NET Core, Xamarin, and Unity applications on Windows, Mac, and Linux.
© 2020 Scott Hanselman. All rights reserved.
Tumblr media Tumblr media Tumblr media Tumblr media Tumblr media
      Review of the Surface Book 3 for Developers published first on http://7elementswd.tumblr.com/
0 notes
Text
Radeon RX 580: Review 2021 | Testing| Specs | Profit ( Good and Bad Side)
Tumblr media
Radeon RX 580 Cryptocurrency Mining: Review 2021 | Testing: Specs | Profit | Hashrate | Settings | Setup| Configuration | CPU Performance: Rx 580 mining - Check out the specification, hashrate, profitability and the payback period of this miner as well as other critical information before buying. After the release of the Radeon RX 580 video card, the entire five hundredth series became one of the most profitable options, not only for gamers, but also for mining. Cards bought up immediately after their appearance in the store and the resulting shortage significantly influenced their price.  The cost of the RX 580 even from the “cheapest” vendors has grown by at least 30-40%. This greatly increased the payback period of the video card and shaken the leading position of the video card. Please note- The Review was done earlier than 2021 so their will be variation inters of the rate would earn when currency exchange is taken into consideration. Is the RX 580 the best and most promising choice for miners in 2018, and is it worth paying attention to this video card today when building a farm from scratch? In this article we will look at all the features, calculations and potential of the card for mining today and the future. People Also Ask: Common Questions Radeon RX 580 Specifications and power Radeon RX 580 First, let's briefly review the technical characteristics of the card, which will help to understand its relevance for mining and a place in the top. There are two versions of the RX 580, 4 and 8 gigabytes of memory. For the rest of the characteristics, there are almost no differences between the cards, and the relevance of the “extra” 4 GB of memory will be considered in the following blocks. - Graphic Engine: AMD Radeon RX 580 - Bus Standard: PCI Express 3.0 - Video Memory: 8GB GDDR5 - Engine Clock:  - 1380 MHz (OC Mode) - 1360 MHz (Gaming Mode) - Stream Processors: 2304 - Memory Speed: 8 Gbps - Memory Interface: 192-bit - Resolution: Digital Max RTop 10 Questions Asked on Amazon Before Buying - Geforce GTX 1060esolution:7680x4320 - Interface: - 1 x DVI-D - 2 x HDMI 2.0b - 2 x DisplayPort 1.4 - HDCP Support: Yes - Maximum Display Support: 4 - Software:ASUS GPU Tweak II  - Dimensions: 9.53 " x 5.07 " x 1.49 " Inch - Recommended PSU: 500W - Power Connectors: 1 x 8-pin - Slot: 2 Slot If you take the most reference version of the RX 580 NITRO from Sapphire, then the characteristics will be as follows: - The core frequency is 1340 MHz for silent mode and 1411 MHz for maximum boost. - Memory - 8,192 GB with a 255-bit bus and a frequency of 2 MHz. - 225 watts peak power. However, this is the most powerful map, which is dispersed from the factory to the maximum. Solutions from other vendors will have lower frequencies, and not only their overclocking potential will be important here, but also some of the manipulations associated with the firmware. The cornerstone of mining is overclocking. In terms of importance, this criterion is second only to the price of the video card itself, which determines the payback and the ratio of income and investment. As is the case with other video cards, everything will depend on the memory manufacturer. In 2018, it was almost impossible to find even the top NITRO and Limited Edition solutions with Samsung memory. Most of the cards come with Hynix memory, whose overclocking potential is significantly inferior, which will certainly affect the overall profit of the farm in mining. This indicator is not critical, but in the case of a possible choice, you should always give preference to Samsung video cards. Top 10 Questions Asked on Amazon Before Buying - Radeon RX 580 XFX Radeon RX 580 GTS XXX Edition 1386MHz OC+, 8GB GDDR5, VR Ready, Dual BIOS, 3xDP HDMI DVI, AMD Graphics Card (RX-580P8DFD6) Question 1: What would this equal to a GeForce card? Answer: Between 1060 and 1070 Question 2: Is this one could build in a mini case? Answer: No, it barely fit in my mid-size NZXT with no modification needed Question 3: Is it with upgrading my nvidia gtx 1050ti 4 gb gddr5 direct 12 graphics card you this one?? Answer: Yes, that's what i had before too gtx 1050ti replaced with this XFX RX 580 GTS XXX Question 4: Will the Radeon RX 580 card work with an msi h110m atx gaming motherboard? Answer: Yes, all you need is a PCI-Express 3.0 x16 Slot, and be sure your PC case is big enough to support a GPU of this size. Question 5: Can I play rainbow six siege with 8gb or 4gb? Answer: That is something you need to find out. Look at the system requirements for the game. Easy Google search. Question 6: It shows that this gpu needs 500 watt. I have a sonnet egfx breakaway box 550w with 80% efficient. it means it only have 440w. so can i use this gpu? Answer: Yes. ( Customer Answered: I have a 750W PSU, the card causes random reboots, it is a terrible design that overloads the 8-pin rail. Check the internet for issues with this particular manufacturer.) Question 7: Does this card come with free games? Answer: It did, when purchased it Question 8: Hey I am just worried , so I ordered nzxt mid tower and will this video card fit in?? Answer: It, Should. Question 9: The Radeon RX 580, is this card compatible with a dell xps 8700? Answer: Sure it does, as long as you have a case that will fit the card. Question 10: I have a ryzen 5 2400g will the rx 580 8gb be better than the gtx 1060 ti 3gb or 6gb for gaming? Answer: I average 120 - 150 fps in rainbow 6 siege if that helps? that on 144hz monitor: ( Customer had this to says as well - The 580 would be a better match because it is around the gtx 1070 in performance.) Best Review Posted on Amazon Before Buying - Radeon RX 580 Best Review Posted on Amazon Before Buying - Radeon RX 580 Customer Review 1 of Radeon RX 580: This card can handle games(destiny, fortnite, Pubg...etc) at high FPS with no issues. The card does use more power than an nvidia card with similar specs and also creates more heat. But at the same time value for the spec is great. As a product I would give it 4 stars however, XFX's warranty service is surprisingly easy and fast. My card broke after a year of use, so I registered the product on their site, I received a response within 24 hours. They troubleshooted and determined the card needs to be RMAed, I sent the card back and I received a new one within a week after they received the defective card. They didn't even require the receipt(required by most companies) even though I had it. It's amazing service compare to my PNY nvidia experience which was like pulling teeth. I will definitely buy more XFX products in the future. Customer Review 2 of Radeon RX 580: Used to be a big Nvidia fan. Then, I started encountering problems with their drivers. Fine, rolled back my driver to an older version. Then, my GTX 970 stopped working after only 1 year of use. Fine, sent it in and received a refurbished one. The refurbished one now has the same issue. Bought the XFX GTS RX 580 8GB DDR5. 0 driver issues. Card still works after a few months and puts out a better picture then my 970. I'll update my review if something goes wrong but, for now, I'm extremely pleased with this card both in terms of performance and price. I'll be all to happy to continue buying AMD in future. Customer Review 3 of Radeon RX 580: Shipping arrive on time and before I got home, which is a first. A few things to note: 1: This is a fairly large GPU, if you don't have a Full Tower, or an opened-air mobo, your mileage with fitting this thing in a Mid Tower WILL differ 2: My old GPU require two 6-pin, while this GPU required one 8-pin, you will either need an 8-pin or a 6+2 pin to connect your power supply to your GPU, if you don't have a powersupply that you modify, well you will need a new one. 3: For the price, this GPU is very strong, even if it is only a rehash of the 480... and being a year old. It can still run newer titles like Monster Hunter: World without too many problems. 4: Price-wise, having to compete with cryto-miners may increase the price of this GPU from time to time... very annoying. 5: This is a major update from my old HD 7950 (which still works for many games that are moderately intense and are new.) Worst Review Posted on Amazon Before Buying - Radeon RX 580 Bad Customer Review of the Radeon RX 580 on Amazon: After about 2 months of having this card, it had fried itself. I was playing fallout when my computer crashed and upon trying to start it up, my USB keyboard and mouse would get power for their LED's but the computer would not start, no fans would spin. A few days later I tried it again to which sparks shot out of my video card. It would have likely fried my whole computer if I hadn't unplugged my power strip from the wall. I later tried another video card that a friend gave me to which I downloaded the drivers for and it worked pretty much fine. Very disappointed in this product as it nearly ruined my entire computer. Bad Customer Review 2 of the Radeon RX 580 on Amazon: When playing intensive full screen games or benchmarks two of the three display ports have issues with intermittent black screens. Confirmed on TWO copies of this card (I bought one, it started doing it, exchanged the card for another, same exact issues). Have spent a decent amount of time doing all the normal troubleshooting (fresh install of Windows, clean and re-installation of AMD drivers, manually playing with the voltage and frequency of the card, etc). Please note: In spite of the bad review by a few customers, we experience no such issues (not saying you won't) but We would comfortably recommend this product for mining cryptocurrency based on our testing that will be continue below. How to increase the potential of the Radeon RX 580 in mining Even in the case of good versions of the RX 580 (Pulse and others), the potential of the card out of the box can hardly be called incredible. If you look at the average power of the video card, excluding the top solutions with good factory overclocking, you can count on the following indicators: - Equihash - 302-310 SOL / s. - X11gost - 8.4-8.6 MH / s. - Daggerhashimoto - 26.1-26.8 MH / s. - Pascal - 0.85 GH / s. The best solution for current maps from Radeon is Ethereum mining. It is during its mining that the maximum potential of the cards is revealed, therefore any trusses with the RX 500 series are usually collected for this critical currency. Mine others is impractical, it is a direct loss of profit. Initially, most of the 580s give from 18 to 22.5 MH / s without overclocking. Against the background of the possible 30-31 MH / s, which are sought by the owners of Radeon farms with the older model of the 500th series, the card mining capacity in the factory settings is not too high. However, a prerequisite for obtaining a better result is the BIOS firmware flashing. This can be called one of the conditional minuses of AMD cards, due to which some miners choose NVidia. However, with a successful acceleration and reduction of energy consumption, the yield of the RX 580 will be quite high, especially with price increases and the overall attractiveness of Ethereum. Therefore, in the absence of sufficient knowledge, it is better to entrust the firmware to specialists. They will make it quickly and without risk to get the so-called “brick”, that is, a “dead” card without the possibility of returning it under warranty. After flashing, the average potential of video cards will increase from 18-22 MH / s to minimum 26.5 MH / s. And in the case of good memory and successful overclocking, it is quite possible to get 28-30 MH / s, in which the RX 580 becomes one of the best in terms of return on video cards. On average, experts recommend focusing on 10-15% acceleration on average. This is the best indicator in terms of temperature, energy consumption and output power in the production of cryptocurrency. Choosing the best OS - Radeon Rx 580 The correct choice of operating system can significantly simplify work with the farm and even reduce costs. In the confrontation of Windows and Linux for farms on the RX 580 usually choose the latest OS. Hive OS has several important advantages over Windows, the most significant of which are: - There is no limit on 8 video cards. - No need to face the difficult process of choosing the right drivers. - The OS was originally developed and adapted for mining. - Ability to work without monitors (using emulators). - No need to have an SSD or HDD, a regular 8-16 GB flash drive is sufficient. - WatchDog is built into the system, you do not need to pay extra for it (as is the case with Windows). - Easy setup and functional remote monitoring. - No need to buy an expensive license. - Telegram notifications in real time. You also need to consider that in case of installation of pirated copies of Windows, the farm can be confiscated, as it is stipulated by law. With Linux, there will be no such problems, which is another important advantage. One of the few advantages of Windows, namely the ability to use a video card for tasks other than mining, is irrelevant, because 99% of the farms are originally created specifically for mining cryptocurrency. This is the main task. Farm payback with RX 580 One of the most important criteria when choosing a video card for mining is their payback. It depends on many conditions. Some of them have variable values, that is, they can constantly change. This concerns the recession or growth of the cryptocurrency rate. Taking into account the relevance of Ethereum, the total payback period for the RX 580 can be 7-9 months, in case of a good time (accumulation of currency on the wallet and sale after peak rises), and 12-15 months. This period is influenced by the following factors (in descending order from the most significant): - The course of the ether. - Firmware (that is, the disclosure of the potential of a particular card in terms of overclocking, power consumption, etc.). - The original cost of cards. - Total investment in the farm (any savings, for example, on the HDD, reduces costs and accelerates payback a little). Experienced miners recommend not to display a fixed broadcast systematically. This is best done only at times when there is a strong growth rate. In general, even under adverse conditions, the return on video card in 2018 is approximately 15-18 months. Given the current conditions for mining and farm profitability, this is not the worst time. The choice between the 4 GB and 8 GB versions for the Radeon RX 580 is almost always unambiguous. If for the GTX 1060 it is possible to consider the 3 GB version as very promising, then for AMD, on which usually the broadcast is mined, only the 8 GB version would be preferable. Of course, after the miner exceeds the allowable amount in the 4 GB version, you can switch to mining other cryptocurrencies, but this is not the best option. In such conditions, it will not be possible to extract the air on 4GB cards after February 6, 2021. Taking into account the not so big difference in prices between 4 and 8 GB, in most cases it is better to take a video card with a large amount of memory for the long term. As far as vendors are concerned, there is not much difference. Buying top solutions, for example, NITRO, is not always profitable. Despite the fact that they provide good cooling and increased power, their cost is much higher than that of the "regular" versions. It is much better to buy Asus Dual or other cards in the same price category. The difference between them, in comparison with the NITRO and LE versions, is about $ 150, but it is almost completely leveled by the correct firmware, pushing all the cards to approximately the same hashrate (+ - 3-4%). Conclusion Despite the fact that the Radeon RX 580 is no longer a top-notch solution, the positions of the video card are still preserved. This is one of the most sensible choices for mining in the medium and long term. With the right approach to buying cards for the farm and selling the naming ether, you can reduce the payback by almost half, making the RX 580 a leader in this criterion. The only significant drawback may be mandatory BIOS firmware. But this issue is easily solved with the help of specialists, because this deficiency can be considered conditional. HOW TO is the best main RX 580 graphics card... XFX Radeon RX 580 GTS XXX Edition 1386MHz OC+, 8GB GDDR5, VR Ready, Dual BIOS, 3xDP HDMI DVI, AMD Graphics Card (RX-580P8DFD6) RX 580 8GB Test in 25 Games in 2020 https://www.youtube.com/watch?v=Je2BWKkkRK0 RX 580 series graphics card AMD and profitable mining - compatible? For lovers of cryptocurrency production, one performance at an exponential speed or another video card with a priority is increasingly an issue in our country. In addition to paying electricity bills and law enforcement officers, who are increasingly asking the president and the government to introduce cryptocurrency on the part of the state monopoly may not be legitimate. Although the mining system is systematically efficient in the process of revenge (this is the longer the time, the longer the "production" of a unit cryptocurrency), the Russian miners are still committed to the same, if in the yard of 2012. Performance RX 580 Series AMD's products are firmly in the leadership of mining fans for many years in a continuous circle. Although their main competitor, Nvidia, has significantly improved the compatibility of mining cards during the five-year period, the company's main support is still the gaming industry and other media. The RX series of 8Gb 580 is updated with the Radeon RX 570 and put into production in 2017 and has picked up a large number of manufacturers. The most popular versions are the models released by "Sapphire" and "EM-ES-AJ". The most popular model series - Features At the end of the summer of 2017, the main mode of this line proved the sapphire NITRO. In Western Europe, the player miner offended all the last tilted copies as the AMD Radeon RX 580, so this card snatched pereproyti "Witcher 3" from the store for the first time with a good frame rate per second. What can please our RX5808 g: -  a clock frequency of 1450 MHz for the possibility of diffusion; -  The impressive number of stream processors - 2304; -  GDDR5 memory is 8192 megaGB, with a frequency of 2000 MHz. In the numerous tests conducted by independent publications of computer hardware, we did not see the special adaptation of this graphics card mining. Many experts even suspect that the commercial success of the card may be caused by "popular rumors" than any actual benefit. When digging "Ether" (Vendetta, ETH for short), with the current version of the miner Claymore double ETH (firmware 9.2), our "Radeon" I give all 22, 5 MHS. It is interfered with the factory settings without a memory or mining calculator. Of course, the result is unqualified. Obviously, memory timing is a high clock frequency that may have to be kept visible by production attenuating. We don't even dare to imagine - it will be 4GB of RAM in this one more budget RX 580. And for the RX 580 Nitro and Western and Russian experts agree that the main news sapphire Radeon revenge enough speed. Perhaps the so-called sapphire pulse is expected to update the model to change the situation of the Meining Ç580. What is the competition? From a little more known to us, the quotation of the MSI RX 580 has appeared on our new version of the "Sapphire". Only when the temperature reaches 60 ° C ventilation system and armor technology in the performance of the company's lead blade cooler it can not be more than those who buy cards to revenge or not. Maintaining a good and strong air supply cycle keeps the investment in the field. In contrast, whether it is the "Sapphire" and "ASUS" versions, this card is able to switch memory modes. What, again, it is more important for gamers and office workers. Read the full article
0 notes
hafizhamza313 · 5 years
Text
MacBook Pro 2019: 16-inch MacBook Pro Release Date, News and Rumors
Tumblr media
The MacBook Pro 2019 refresh is an upgrade from its 2018 predecessor, but there are still many MacBook users that are getting fed up with recent moves that Apple’s made. The Cupertino company has been releasing new MacBook Pros and other Apple devices faster than we can replenish our bank accounts. And, while some of these upgrades have resulted in better devices – hats off to you, Mac mini – many of them are falling a bit flat. These updates are all due to Apple changing the way it approached the MacBook Pro back in 2016, focusing on features that make one of the best Macs more accessible to everyday users by further streamlining the design. This is perhaps why Apple got rid of all the non-Thunderbolt 3 ports. And, the MacBook Pro is certainly a case of ‘if it ain’t broke, don’t fix it’. However, there are plenty of old-school Apple users that are being alienated by Apple’s latest design philosophy. To be fair, the MacBook Pro 2018 did benefit from a significant internal revamp, now rocking 8th-generation Coffee Lake processors and some of the fastest SSDs we’ve seen to date, giving it unprecedented speed. Additionally, Apple has improved the display with True Tone tech, as well as fixed the issues with the Butterfly keyboard – or some of them at least. It did all that while maintaining the impressive battery levels of the 2017 model. As of July 2019, these are true as well for the 13-inch MacBook Pro entry-level model, which – thanks to a recent refresh – now touts the 8th-generation Intel Core i5 chip, True Tone Retina display and the improved keyboard – not to mention the Touch Bar, Touch ID and Apple’s T2 security chip. Even better, Apple has also released 15-inch MacBook Pro configurations that tout the latest 8-core Intel Core i9, Intel’s latest 9th generation processor (CPU). These configurations are now Apple’s top-of-the-line MacBook Pros, the cheapest of which supports the 2.3GHz 8‑core 9th‑generation Intel Core i9 (Turbo Boost up to 4.8GHz) CPU with 16GB memory (RAM), a Radeon Pro 560X graphics (GPU) and 512GB SSD storage at $2,799 (£2,699, A$4,099). What do these latest improvements mean for the 16-inch MacBook Pro 2019 we've also been anticipating? Recent rumours have reinforced the possibility of a true redesign gracing us with its presence this year, perhaps alongside the new Mac Pro 2019. However, Apple might opt to roll out a completely new model in 2020 or 2021. Regardless, the true MacBook Pro 2019 could still happen, especially because the 12-inch MacBook was recently dropped from the lineup. Famed Apple analyst Ming-Chi Kuo has produced some leaks that suggest Apple is going to release a 16-inch MacBook Pro 2019 with an “all-new” design, and that could still happen this year. Plus, we’re still keeping our fingers crossed they’ll give that still-pesky keyboard a redesign, and recent reports are looking good in that regard. If the completely overhauled MacBook Pro 2019 is still in the works, here are the things we’re hoping to see. Cut to the chase What is it? The first-ever 16-inch MacBook Pro When is it out? Possibly late 2019 How much will it cost? Reports say around $3,000 (about £2,450, AU$4,350)
Tumblr media
Image Credit: Apple MacBook Pro 2019 release date While a 2019 follow-up redesign to the MacBook Pro (as well as the Mac Pro) is still something many Apple fans are looking forward to, there’s only a vague "2019" rumoured release date at this time. That’s hardly a surprise since the last major update came out in July 2018, and Apple just dropped its 9th-gen Intel Core CPU configurations and a 13-inch MacBook Pro July 2019 refresh. Still, it would be curious to see macOS Catalina in an overhauled MacBook Pro. We might need to wait a few more months for a clearer date, or even a confirmation that it will happen, but rumours and reports are piling up. If we’re going by Apple’s previous release dates – with the MacBook Pro 2017 launching in June 2017 and the MacBook Pro 2018 model debuting under the radar in July 2018 – a summer release for the MacBook Pro 2019 made sense. We must consider that, after Intel announced the 10nm Ice Lake chips at CES 2019 for a late 2019 release date, and Apple's macOS Catalina is set for a fall release – not to mention the fact that it just gave the 15-inch model a 9th-gen refresh and the 13-inch model an 8th-gen and Touch Bar refresh. With these two factors in mind, we could see Apple released the MacBook Pro somewhere around then to take advantage of the new tech and operating system update immediately. The possibility of a redesigned MacBook Pro in 2019 is shaky still, but we’re keeping our fingers crossed. We'll be keeping a close eye on Apple's forgone iPhone event in September – if not, we could be looking at an October or even November launch of this 16-inch MacBook Pro.
Tumblr media
Image Credit: TechRadar MacBook Pro 2019 price Just the fact that Apple has refreshed its non-Touch Bar MacBook Pros this July 2019 to boast a higher processor, a better keyboard, and the Touch Bar and Touch ID features says a lot about the direction in which they’re going. The worst part of the 2019 models is the fact that you need to spend thousands of dollars to secure the highest amount of RAM (the 32GB RAM option is only available to the 15-inch models) and the largest SSD combo. Otherwise, you’re stuck with the smallest amount of RAM and a Touch Bar that isn’t even universally compatible. You can also forget about upgrading your 13-inch. The majority of its parts are soldered, so you’ll be forced to bring in a professional and break your warranty. You also may have to buy a third-party docking station since the only ports available to you are the USB-C Thunderbolt 3 ports and a headphone jack. Of course, it’s hard to predict these things, but you’ll probably see more of the same price tags. Currently, the base model of a 13-inch MacBook Pro gets you an 8th-generation Intel Core i5 processor, 8GB of memory, 128GB SSD storage and the integrated Intel Iris Plus Graphics 645. The good news is that this entry-level 13-inch model sticks with the same price tag at $1,299 (£1,299, AU$1,999), despite its recent upgrades. As far as the 15-inch’s 9th gen Intel Core i9 configurations, which currently start at $2,399 (£2,399, A$3,499), we’d like to keep its price while still getting its design refresh. A Chinese technology site has recently reported the 16-inch MacBook Pro to start around $3,000 (about £2,450, AU$4,350). Later, analyst Ming Chi-Kuo reported improvements to the keyboard with a new scissor-switch design. It may seem outlandish to you, but "$2,999" sounds like a fine price for Apple to go with.
Tumblr media
Image Credit: Apple What we want to see from MacBook Pro 2019 Apple hasn’t been getting a lot of love lately, what with the great keyboard debacle of 2016 through 2019, the problematic Touch Bar and the soldered RAM, to name a few. However, we’re still hoping that Apple takes a new lease on life in 2019, and offer its loyal fans some of the things they need and not upsell them on things they don’t need. And, so far, it does look promising. More port variety, please If Apple’s going to insist on sticking with only Thunderbolt 3, at least include a docking station or an adapter for USB, Mini DisplayPort and HDMI without an additional charge. You know, instead of making us shell out more money for a third-party one. Though if we’re really being honest, limiting us to Thunderbolt 3 is inconvenient. MacBook Pro is for professionals who are looking for a seamless workflow. Yes, the Thunderbolt 3 is efficient, powerful and versatile, offering a port for charging as well as super-fast output and data transfers. But many of us are still using devices and accessories that don’t support it. If we’re expected to connect this cable to that adapter to plug into that other port, Apple cannot expect us to be happy about it. And, is it possible to bring back MagSafe charging? Some of us tend to trip over those cables, and it was nice to know that we wouldn’t damage those older models by doing so. Seeing as how Apple launched the latest MacBook Air with just two Thunderbolt 3 ports and no MagSafe charger, chances are slim that Apple is going to diversify the ports on the MacBook Pro 2019. So, what we’re really gunning for now is a docking station or an adapter out of the box. Improved Touch Bar When it comes to the new Touch Bar, it seems that Apple is standing its ground. In fact, every single current MacBook Pro model and configuration now has it. Despite grievances from many users and the fact that many others would be happier without it, Apple has completely phased out the non-Touch Bar MacBook Pros. That’s all fine and dandy, and, we admit, the technology is promising. However, if Apple is going to ask for a few hundred bucks for a new feature, we’d like to maximize its use. So far, Touch Bar compatibility is only limited to a few programs and apps, and we demand to see more added to this list if it’s here to stay. Plus, it would be nice to get it properly working sans the freezes and fat finger issues.
Tumblr media
Image Credit: TechRadar Offer an alternative Again, the Touch Bar shows promise, but it isn’t exactly something that most users need. Not all of us are photographers or filmmakers or artists. Some use just use their MacBook Pros for productivity because of its streamlined interface and user-friendliness, and those folks – the mainstream consumers that Apple is now starting to include in their target market – don’t have a need for a Touch Bar at all. In fact, it might only get in the way and cost them extra for something they’ll hardly use. We’d like to see an option for such users. We’d like Apple to resuscitate the Touch Bar-free models, and price them cheaper than their Touch Bar counterparts. Better keyboard, display and sound Sure, Apple has refined the keyboard so that it’s quieter, but it’s still stiffer than what we’re used to, has less travel than most keyboards and still annoyingly loud. As far as ergonomics, the new keyboard isn’t the best. It’s not that we hate it, but it’s certainly harder to love. Of course, we could get used to its quirks, but we’d rather see a better one, especially since we’re already paying a lot. Better yet, Apple could bring back the old design most of us are partial to, which now seems likely. In addition, MacBook Pro designers should take cues from the iPhone X and phase out the bezel design. That’s a lot of valuable real estate wasted, and rolling out a bezel-free screen will give users a bigger display without having to sacrifice size. Additionally, by going bezel-free, it’ll give the update a fresher, more modern look. Also, a laptop designed to edit 4K media should have a 4K screen option – though, we're hearing that 3K is going to be this laptop's game. And, while Apple’s at it, we’d like to see it reconsider those speakers. We get that Apple is going for a thinner design, but the speakers in the older MacBook Pros are considerably better. With all that technology they come up with, it wouldn’t be that hard to get a premium set of speakers that are louder and have better bass. New Intel processor The MacBook Pro 2018 just got updated with Kaby Lake Refresh and Coffee Lake on the 13-inch, and with the 9th-generation chips on the 15-inch. Unfortunately, Intel has already pushed out Whiskey Lake and Amber Lake processors – putting some of the MacBook Pro configurations behind the curve, if only just barely. Whiskey Lake, after all, only provides a minimal boost to the performance that most users won’t notice. At CES 2019, however, Intel announced Ice Lake, its first round of 10nm Sunny Cove processors for laptops. These chips have already begun shipment, and we’re hoping that Apple already has plans to include these next-gen CPUs in the 16-inch MacBook Pro 2019. If (and, hopefully, when) that happens, we’d like to see Apple offering both 9th-gen and 10th-gen configurations, as well as awarding one of its 13-inch model a 9th-gen Coffee Lake Refresh configuration. See our MacBook 2018 vs MacBook 2019 comparison here. Read the full article
0 notes
moddersinc · 5 years
Text
MSI Gaming X Geforce GTX 1660 TI Review
Tumblr media Tumblr media
When Nvidia’s 10 series of graphics cards hit two years old, we all started to wonder when the next generation would finally be announced. When the Titan V launched, we had all expected the 11 series to follow shortly after that and be based on Nvidia’s Volta architecture. Then, the rumors started to circulate about the next generation being the 20 series. Many of us, myself included thought this was a joke and that they weren’t going to skip over 11-19. Then, that’s exactly what they did when Nvidia announce the all-new RTX 20 series of GPUs with the RTX 2070, 2080 and 2080 ti and more recently, the 2060. With the move to the RTX series with real-time Ray Tracing, many never expected to see the GTX label on a card again. We were recently thrown a curve ball with the announcement of the GTX 1660 ti. The GTX 1660 ti is still based on the same Turning architecture of the 20 series. However, it doesn’t have the benefit of real-time ray tracing, like the 20 series. Pleased just below the RTX 2060, the GTX 1660 TI is said to have performance that rivals that of the GTX 1070, which launched at a much higher price point. Over the last several generations, I’ve personally grown partial to MSI graphics cards. Their coolers are big, beefy and keep the cards cool, even under load. So, when they asked us to take a look at their all new Gaming X 1660 ti, I couldn’t wait to get my hands on it. If the 1660 ti performs as well as they claim, this card could redefine budget gaming. We put the MSI Gaming X 1660 ti through our suite of benchmarks to see how it stacks up to other cards at a similar price point. Specifications and Features MODEL NAME                                              GeForce® GTX 1660 Ti GAMING X 6G GRAPHICS PROCESSING UNIT                    NVIDIA® GeForce® GTX 1660 Ti INTERFACE                                                    PCI Express x16 3.0 CORES                                                            1536 Units CORE CLOCKS                                               Boost: 1875 MHz MEMORY SPEED                                           12 Gbps MEMORY                                                       6GB GDDR6 MEMORY BUS                                               192-bit OUTPUT                                                         DisplayPort x 3 (v1.4) / HDMI 2.0b x 1 HDCP SUPPORT                                            2.2 POWER CONSUMPTION                             130 Watts POWER CONNECTORS                                8-pin x 1 RECOMMENDED PSU                                  450 Watt CARD DIMENSION(MM)                             247 x 127 x 46 mm WEIGHT (CARD / PACKAGE)                       869 g / 1511 g AFTERBURNER OC                                       YES DIRECTX VERSION SUPPORT                      12 API OpenGL VERSION SUPPORT                      4.5 MAXIMUM DISPLAYS                                  4 VR READY                                                      YES G-SYNC™ TECHNOLOGY                             YES DIGITAL MAXIMUM RESOLUTION            7680 x 4320 Key Features Twin Frozr 7 Thermal Design TORX Fan 3.0 - Dispersion fan blade: Steep curved blade accelerating the airflow. - Traditional fan blade: Provides steady airflow to massive heat sink below. Mastery of Aerodynamics: The heatsink is optimized for efficient heat dissipation, keeping your temperatures low and performance high. Zero Frozr technology: Stopping the fan in low-load situations, keeping a noise-free environment. RGB Mystic Light Customize colors and LED effects with exclusive MSI software and synchronize the look & feel with other components. Dragon Center A consolidated platform that offers all the software functionality for your MSI Gaming product. Packaging
Tumblr media
The front of the box has the MSI logo on the top left-hand corner. An image of the graphics card takes up the majority of the front of the box. Across the bottom, there are the Gaming X and Twin Frozr 7 branding to the left. To the right are the Geforce GTX logo and 1660 ti branding.
Tumblr media
The back of the box has a breakdown of the Twin Frozr 7 thermal design. This includes the aerodynamics of the heatsink, as well as a picture showing how the Torx Fans 3.0 work. Below that, they mention a few of the features of Geforce Experience, as well as some of the key features of the Gaming X 1660 ti. The last thing of note are the minimum system requirements for the card. Now, let’s take a close look at the MSI Gaming X 1660 ti.
Tumblr media
The card comes packed in soft foam and in an anti-static bag. Along with the card, there is a quick start guide, a driver disk and a couple of coasters. The Gaming X 1660 ti also came packed with a Luck The Dragon comic book and a thank you note from MSI for purchasing one of their graphics cards.
Tumblr media
A Closer Look at the MSI Gaming X 1660 ti.
Tumblr media
The MSI Gaming X 1660 ti is a PCIe x16, gen 3.0 card. The Gaming X 1660 ti has a boost clock speed of 1875 Mhz. The Gaming X 1660 ti has 6 GB of GDDR6 memory on a 192-bit bus and running at 12 Gbps. The MSI Gaming X GTX 1660 ti is an average sized card measuring 247 mm long, 127 mm tall and 47 mm thick, or about 9.75” x 5.75” x 1.85”. The Gaming X 1660 ti is sporting the Twin Frozr 7 cooler with Torx Fans 3.0 and Zero Frozr technology.  This is the seventh generation if the very popular Twin Frozr Cooler from MSI. MSI claims that with the Twin Frozr 7 cooler, they have “mastered the art of aerodynamics.” Airflow Control Technology forces the flow of air directly onto the heat pipes. At the same time, the heatsink of the Twin Frozr cooler provides a large surface area to help dissipate more heat from the heatsink. The heatsink is made up of three, 6mm copper heat pipes that run through a massive, tight aluminum fin array that make up the heatsink of the Twin Frozr 7 cooler. Like other MSI Gaming series cards, the Gaming X 1660 ti uses a large nickel-plated copper base plate to transfer heat from the GPU to the heat pipes. MSI uses only premium thermal compound on the GPUs that is designed to outlive their competition. They also use a die-cast metal sheet that acts as a heatsink for the memory modules. This die-cast sheet connects directly to the IO bracket. This provides additional protection from bending, along with the back plate. The Torx Fan 3.0 has two distinct types of fan blades. The first being the traditions type of blade designed to push air down steadily to the heatsink. The second is what MSI refers to as a dispersion fan blade. This style of fan blade is slightly curved. This curve allows the fan to accelerate airflow, increasing the effectiveness of the fan. One of the key features of the Twin Frozr cooler is its Zero Frozr technology. First implemented in 2008, the Zero Frozr technology allows your card to be silent when under 60°c. As long as the card is under this temperature, the fans will not spin. Once the card reaches above 60°c, the fans will start to spin. This keeps the card silent while the system is idle, or the card is under a light load. While benchmarking, or under a heavy gaming load, the fans spin up to keep the card cool. The MSI Gaming X 1660 TI has what one might call a traditional IO. Meaning one not designed with virtual reality in mind. The IO consists of a single HDMI 2.0 and three DisplayPort 1.4 ports. The Gaming X 1660 ti also has a brushed aluminum backplate that helps with the rigidity of the card. Both the shroud and the back plate wrap around the end of the card for additional protection of the heatsink. This card also sports a custom PCB with a 4+2 power phases and designed with high-end components. The Gaming X 1660 ti is powered by a single 8-pin power connector. Technically, this tier of card is powered by a single 6-pin. However, the combination of the custom PCB and 8-pin power connector should help with overclocking. Like with other graphics card reviews, we did a tear down of the MSI Gaming X 1660 ti. The MSI Gaming X 1660 ti is based on the Nvidia TU116 GPU. The TU116 GPU has 1536 Cuda Cores, 96 TMUs, 48 ROPS and a max TDP of 120 watts. The memory modules are covered by thermal pads to help dissipate the heat. MSI uses 6 GB Micron GDDR6 memory on their Gaming X 1660 ti, model number MT61K256M32 to be exact. Even the components are designed to look good with the MSI dragon logo on each Ferrite Choke. The Super Ferrite Chokes are labeled with SFC on them. Like on other cards they use their Hi-C capacitors. The Gaming X 1660 ti uses a 4 + 2 Phase PWM Controller. I did notice 2 empty spots for memory. So, maybe there will be an 8 GB variant of this card eventually? RGB Lighting and Software
Tumblr media
Like most components in your system these days, the Gaming X 1660 ti has RGB lighting with customization options through the RGB Mystic Light app. There is RGB lighting on the side of the card, with the MSI Twin Frozr 7 logo is placed. There is also RGB lighting on the top and bottom of each of the Torx Fans. The Mystic Light app has a total of 19 different setting you can set your card to. These settings are listed below. Rainbow Flowing Magic Patrolling Rain Drop Lightning Marquee Meteor Stack Dance Rhythm Whirling Twisting Fade-In Crossing Steady Breathing Flashing Double Flashing One quick side note on the Mystic Light app. I’m not sure if this was an addition, or excluding to the Gaming X Trio 2080 ti. However, when I checked the Mystic Light app with the Gaming X Trio 2080 ti installed, there was a 20th setting for the lighting on the card. This being Laminating which was similar to the Patrolling effect. Dragon Center The MSI Dragon  Center is a desktop application with several functions. It has a function to monitor CPU temperature to the far left of the main screen. There is a Gaming Mode that will optimize your system, monitor, overclock, and Zero Frozr mode with one click.
Tumblr media
The performance section has two different preset, and two profiles you can customize. The preset profiles are Silent and OC. These presets, as well as the custom profiles that set the performance of your system.
Tumblr media
There is a hardware monitoring section that allows you to monitor several different aspects of your system. You can monitor the following: GPU Frequency GPU Memory Frequency GPU Usage GPU Temperature GPU Fan Speed (%) GPU Fan Speed (RPM) You can even monitor fan speed by both percentage and RPM per fan. The Eye Rest section allows you to customize different setting on your monitor. There are five presets in the Eye Rest Section. They are Default, EyeRest, Game, Movie and Customize. They each have different presets for Gama, Level, Brightness, and Contrast. Each of these can be adjusted for the Reds, Greens, and Blues on the monitor.
Tumblr media
The Dragon Center also has a LAN Manager that allows you to set the priority of your internet usage for different applications such as media streaming, file sharing, web browsing or gaming. The Lan Manager has a chart that tells you what applications use the most bandwidth on your system. For me, its Chrome. It even has its own Network Test. The Dragon Center is also where you can enable and disable Zero Frozr Mode. Test System and Testing Procedures
Tumblr media
Test System Intel Core I7 8700k @ stock settings (3.7 GHz Base) Z390 Aorus Pro MSI Gaming X 1660 ti 32 GB of G. Skill Trident Z DDR4 3200 Cas 16 (XMP Profile #1) Intel 512 GB SSD 6 NVMe M.2 SSD (OS) 1 TB Crucial P1 NVMe M.2 SSD (Games and Utilities) Swiftech H320 X2 Prestige 360 mm AIO Cooler 1600 Watt EVGA Super Nova P2 80+ Platinum Power Supply Primochill Praxis Wet Bench Games Battlefield V Deus Ex: Mankind Divided FarCry 5 Final Fantasy XV Ghost Recon: Wildlands Shadow of The Tomb Raider Shadow of War Witcher 3 Synthetic Benchmarks 3DMARK Firestrike Ultra 3DMARK Time Spy Extreme Unigine Superposition VRMark – Orange Room VRMark – Cyan Room VRMark – Blue Room Utilities GPUZ Hardware Monitor MSI Afterburner MSI Dragon Center Mystic Light FurMark All testing was done with both the CPU (8700k) and GPU at their stock settings. For the I7 8700k, it was left at its stock speed of 3.7 GHz. However, this particular chip usually boosts between 4.4 and 4.5 GHz. The one exception was when we tested the overclocking capabilities of the Zotac 2060 AMP. Although ambient temperature does vary. We do our best to keep the ambient temperature around 20°c or 70°f. Each game we tested was run three time each, and the three results were averaged out. Each benchmark was run for 180 seconds, or 3 minutes. For synthetic benchmarks, each was run three time as well. However, instead of averaging out these results, we picked the best overall result. The charts in the gaming section show a comparison between the MSI Gaming X GTX 1660 ti and the Zotac RTX 2060 AMP. All games were tested at their highest presets except for one. The one exception was Battlefield V was tested on the games High Preset on the Gaming X 1660 ti and the Zotac 2060 Amp. This was due to the game running below 30 FPS on one of the test runs. For all testing, I use the highest preset that allows the game to give a result over 30 FPS or what we would consider playable. Synthetic Benchmarks 3DMARK 3DMARK is the go-to benchmark for all enthusiasts. It covers tests for everything from tablets and notebooks to gaming laptops and the most powerful gaming machines. For this review, we tested both the MSI Gaming X 1660 ti and the Zotac RTX 2060 AMP on both the DX11 4k benchmark, Firestrike Ultra and the DX12 4k benchmark, Time Spy Extreme. In both 3DMARK Time Spy Extreme and Firestrike Ultra, the MSI Gaming X 1660 ti landed just behind the RTX 2060 AMP. On Firestrike Ultra, the Gaming X 1660 ti had an overall score of 3368 and a graphics score of 3180. In Time Spy Extreme, the Gaming X 1660 ti achieved an overall score of 2878 and a graphics score of 2801. VR MARK VR Mark consists of three separate tests, each more intensive on your system than the last. These tests are The Orange Room, the Cyan Room, and the Blue Room. The Orange Room test is the least intense and is meant to test a system that meets the minimum hardware requirements for VR Gaming. The Cyan Room shows the user how the use of an API with less overhead can provide the user with a great VR experience, even on less than amazing hardware. The Blue Room test is designed for the latest and greatest hardware. The Blue Room renders at a whopping 5k resolution and is designed to really push your system to the limits for an amazing VR experience. In all three tests, the MSI Gaming X 1660 ti fell just short of the 2060 AMP. But, this was to be expected as its a lower tier card.  In the Orange Room benchmark, The MSI Gaming X 1660 ti achieved a score of 9614. The Cyan Room test was closer, but The MSI Gaming X 1660 ti s scored 6348 and in the Blue room, The MSI Gaming X 1660 ti scored 1985. Superposition Superposition is another GPU intensive benchmark put out by Unigine, the makers of both the very popular Valley and Heaven benchmarks. Superposition is an extreme performance and stability test for graphics cards, power supplies, and cooling systems. We tested the MSI Gaming X 1660 ti in two resolutions in Superposition.  4k optimized and 8k optimized. In the 4k optimized test, the MSI Gaming X 1660 ti score 5046. Last was the 8k optimized test where the  MSI Gaming X 1660 ti scored 2056. Gaming Benchmarks Battlefield V Battlefield V is a first-person shooter EA DICE and published by Electronic Arts. Battlefield V is the latest games in the Battlefield series. Battlefield V takes place during World War 2. It has both a single player and an online portion. For this review, we tested part of Battlefield V single player, War Stories. The section that was tested was the second act of the Nordlys War Story. You play a young woman who is part of the Norwegian resistance whose mission it is to save her mother and help destroy a key component the Germans need to complete their atomic bomb. Battle Fiend 5 was one of the first games to support the new Ray Tracing feature from Nvidia. This game was tested with both DXR on and DXR off. The charts show a comparison between the GTX 1070 TI FTW 2 at its ultra-preset and the Zotac RTX 2060 AMP at the games high preset. The MSI Gaming X 1660 ti did very well in Battlefield V, even in 4k. Keep in mind, Battlefield V was tested on the games High Preset, not ultra. In 1080p, Battlefield V averaged 96 FPS. Even in 1440p, the game stayed above 60 FPS with an average of 63. I was pretty surprised that the game averaged 29 FPS in 4k. If it hadn't been for the dips into the mid-teens, I may have considered it payable at 4k. Deus Ex: Mankind Divided Deus Ex: Mankind Divided is an action role-playing game with first-person shooter and stealth mechanics that released in 2016. Set two years after Human Revolution in 2029, the world is divided between normal humans and those with advanced, controversial artificial organs called augmentations. You take up the role of Adam Jensen, a double agent for the hacker group Juggernaut Collective, who is equipped with the latest and most advanced augmentation technology. This game is beautiful and still very demanding on your system. The section benchmarked was near the beginning of the game, after the tutorial. In Deus Ex: Mankind Divided, The MSI Gaming X 1660 ti did very well in 1080p averaging 82 fps. In 1440p, it fell short of 60 with an average of 49 fps. Under 60, but still very playable. If the settings had been lowered to medium or even high, The MSI Gaming X 1660 ti could handle 1440p in this game. Although the game did average 32 fps in 4k, I still wouldn't consider it playable. The lows were too often Far Cry 5 Far Cry 5 is the latest is the far cry series. It takes place in the fictional Hope County Montana. You play the role of the un-names deputy who’s sent to arrest Joseph Seed, the leader of the dangerous Edens Gate Cult. However, things do not go as planned and you spend the game trapped in Hope County attempting to take out Joseph and the rest of his family as they attempt to take over the entire county. Far Cry 5 was released in 2018. Ubisoft has developed a beautiful open world with amazing visuals. However, the game is very demanding on even the most powerful systems. This game was tested with the in-game benchmark, as well as near the beginning of the game when you first leave the bunker owned by Dutch as you attempt to clear his island of cult members. The MSI Gaming X 1660 ti did very well in Far Cry 5. It averaged 90 FPS in 1080p on max settings and ever stayed over 60 in 1440p with an average of 64. Even in 4k, The MSI Gaming X 1660 ti averaged 30 FPS and with a minimum of 25, I would consider the game playable. Especially if you lowered the settings to medium or high. Final Fantasy XV Fans of the Final Fantasy series waited well over a decade for this game to release. Final Fantasy XV is an open-world action role-playing game. You play as the main protagonist Noctis Lucis Caelum during his journey across the world of Eos. Final Fantasy XV was developed and published by Square Enix as part of the long-running Final Fantasy series that first started on the original NES back in the late 1980s. The section that was benchmarked was the first section near the start of the game, where there was actual combat. In Final Fantasy XV, The MSI Gaming X 1660 ti was very close in both 1080p and 1440p. The game averaged 58 FPS in 1080p and 52 in 1440p. I wouldn't consider playing this game in 4k with The MSI Gaming X 1660 ti since it only averaged 28 FPS with lows into the teens. I'd say Final Fantasy XV is a solid 1080p  game, even with the settings lowered a bit. Gaming Benchmarks Continued Ghost Recon: Wildlands Tom Clancy's Ghost Recon Wildlands is a third-person tactical shooter game. You play as a member of the Delta Company, First Battalion, 5th Special Forces Group, also known as "Ghosts", a fictional elite special operations unit of the United States Army under the Joint Special Operations Command. This game takes place in a modern-day setting and is the first in the Ghost Recon series to feature an open world with 9 different types of terrain. The benchmark was run at the beginning of the first mission in the game. Ghost Recon Wildlands performed well on all resolutions with The MSI Gaming X 1660 ti. In 1080p, it averaged 80 FPS on the games highest preset. In 1440p, it averaged 59 FPS. However, it was still a nice smooth experience. Even in 4k, The MSI Gaming X 1660 ti stayed above 30 with an average of 35 fps and a minimum of 27. Still playable. Shadow of the Tomb Raider Shadow of the Tomb Raider is set to be the third and final game of the rebooted trilogy developed by Eidos Montréal in conjunction with Crystal Dynamics and published by Square Enix. In Shadow of the Tomb Raider, you continue your journey as Lara Croft as she attempts to finish the life work of her father. Her in a journey that takes her from Central America to the hidden city of Paititi as she attempts to stop Trinity in their attempt to gain power. Section benchmarked was near the beginning of the first section that takes place in the hidden city. This was compared to the in-game benchmark which seems to be an accurate representation of the gameplay. In Shadow of the Tomb Raider, The MSI Gaming X 1660 ti is showing itself, again, to be a great 1080p gaming card with an average of 88 fps on the game's ultra preset. In 1440p, The MSI Gaming X 1660 ti averaged a respectable 55 fps in the latest Tomb Raider installment. In 4k, The MSI Gaming X 1660 ti only averaged 27 fps and had a low of 21. Shadow of War Shadow of War is an action role-playing video game developed by Monolith Productions and published by Warner Bros. Interactive Entertainment. It is the sequel to the very successful Shadow of Mordor that released in 2014 and is based on J. R. R. Tolkien's legendarium. The games are set between the events of The Hobbit and The Lord of the Rings in Tolkien's fictional Middle Earth. You play again as Talion, a Ranger of Gondor that was brought back to life with unique abilities after being killed with his entire family as the start of the last game. Monolith Studios has created a beautiful open world with amazing gameplay and visuals. The MSI Gaming X 1660 ti did well on Shadow of War in both 1080p and 1440p. In 1080p, The MSI Gaming X 1660 ti averaged 78 fps and in 1440p, it averaged 54 fps. Even in 4k, The MSI Gaming X 1660 ti averaged above the playable lever with an average of 32 fps. The Witcher 3 The Witcher 3 is an action role-playing game developed and published by CD Projekt. Based on The Witcher series of fantasy novels by Polish author Andrzej Sapkowski. This is the third game in the Witcher Series to date and the best so far. You play as Geralt of Rivia on his quest to save his adopted daughter from the Wild Hunt. At its release in 2015, The Witcher 3 has some of the most beautiful graphics ever seen in a game, as well as some of the most demanding. Even today, almost 4 years later, the Witcher 3 still holds up very well and brings even the most powerful systems to their knees. The game was benchmarked during the hunt for and battle with the Griffin near the start of the main story. I was most surprised with the performance of The MSI Gaming X 1660 ti in the Witcher 3. This is a very demanding game and when the MSI Gaming X 1660 ti only averaged 63 fps in 1080p. I had expected much lower than an average of 53 fps in 1440p. The one that got me was the average of 31fps in 4k on the game's ultra preset. I had expected that to be lower. Overclocking, Noise, and Temperatures
Tumblr media
  For overclocking, we used MSI Afterburner. Like Firestorm from Zotac used in the 2060 AMP review, MSI Afterburner has an OC Scanner feature. However, unlike Firestorm, I was able to get the OC Scanner in MSI Afterburner working. To validate the overclock, we used the GPU Stress Test in FurMark. Being a non-reference card, The Gaming X 1660 ti is already a factory overclocked card with a boost clock of 1875 MHz out of the box. Afterburner has the base clock speed as 1215 MHz. MSI Afterburners OC Scanner was able to get the base clock to 1335 MHz and it boosted to 2040 MHz. With manual overclocking, I increased that to +125 on the core with the card boosting to 2070 MHz. The 2070 MHz was the best boost clock speed the Gaming X 1660 ti achieved.  As for the memory, I was nearly able to max out the slider in MSI Afterburner with the Gaming X 1660 ti. I added an additional +1200 on to the memory. Anything past that and the benchmark would crash. During the gaming benchmarks, the MSI Gaming X 1660 ti maxed out 61°c. However, even while running FurMark to validate the GPU overclock, the card never saw 70°c, maxing out at 69°c.   For noise testing, I used the Sound Meter Android app by ABC Apps found in the Google Play app store. Noise testing wasn’t done with an actual decibel meter. It may not be the best solution, but it works either way. The app gives you a min, max, and average for the noise level in decibels. The noise levels were tested with the fans at 25%, 50%, and 100%.  At 100%, the max decibel level was 51.3 and the average was 46.6/ At 50%, the max was 38.1 and the average was 32.5. Finally, at 25%, the max decibel level was 36.5 and the average was 30.9.
Tumblr media
  Final Thoughts and Conclusion
Tumblr media
The MSI Gaming X GTX 1660 ti offered a far better gaming experience than I had expected when I first heard of its release. I had originally expected it to perform slightly above the GTX 1050 ti. But, I was wrong. The Gaming X 1660 ti performed just below the RTX 2060 AMP we recently reviewed. This puts the MSI Gaming X 1660 ti on par with the GTX 1070. However, the Gaming X 1660 ti launched at a lower price of 1070. The card design is, like other Gaming X cards from MSI, beautiful. I love the use of Neutral colors. The gray and black color scheme will allow the card to fit well into most builds and the brushed aluminum backplate looks great. Its good to see most companies doing this now. Although I'm not a fan of RGB lighting, the Gaming X 1660 ti doe RGB right. The RGB lighting on the Gaming X 1660 ti is subtle and not overdone. With the Mystic Light app, you can customize it however you want.  You can even disable the lighting altogether. The MSI Gaming X 1660 ti has proven to be a beast of a 1080p gaming card. All eight games we tested averaged over 60 FPS, some even into the 80s and above. A couple of the game including Far Cry 5 and Battlefield V even averaged over 60 FPS in 1440p. Far Cry 5 was even on the games Ultra Preset. So, by lowering the details on many modern titles, the MSI Gaming X 1660 ti could easily handle many modern games in 1440p. Some games were playable in 4k, such as The Witcher 3 which averaged 31 FPS in 4k. However, the MSI Gaming X 1660 ti is not a 4k gaming card, nor was it intended to be. If you're looking to build a system on a tight budget, the MSI Gaming X GTX 1660 ti is a great card to consider. At the time of this review, we found the MSI Gaming X 1660 ti on Amazon for about $360. However, that's a fair amount over the $309.99 msrpMSRP has set on this card. So, look around and you can find it for a better price I'm sure.     amzn_assoc_tracking_id = "dewaynecarel-20"; amzn_assoc_ad_mode = "manual"; amzn_assoc_ad_type = "smart"; amzn_assoc_marketplace = "amazon"; amzn_assoc_region = "US"; amzn_assoc_design = "enhanced_links"; amzn_assoc_asins = "B07N825Y1L"; amzn_assoc_placement = "adunit"; amzn_assoc_linkid = "c560a2637b6c8b5e007383ecbc4f63c1"; Read the full article
0 notes
Text
Gaming on Windows is just better.
Reasons windows is better I'll, save you guys the trouble there aren't any actually yeah. I agree. Are you guys kidding me? The vast majority of the world runs Windows on the desktop and believe it or not. There are some pretty darn good reasons for it, so guys we compiled the top 10 of them from our community to share with you in this video thanks LastPass for sponsoring a portion of this video. They relieve the burden of trying to remember all your passwords for every website. Let LastPass fill in your passwords, for you learn more at the end of the video or at the link below [, Music ]. First up and this one's a shocker gaming, our community spoke, and we agree. Gaming on Windows is just better. 
Not only are there tons of current games for the Windows PC platform like literally thousands of them, but accessing them and keeping them up to date is much simpler than it used to be thanks to online marketplaces like Steam, origin, you play, and yes, even the epic Game store and Windows gaming has far more going for it than just the current library. Recent progress towards integration with Microsoft's Xbox ecosystem has brought cross-platform play to some titles and even cross-platform purchases, and on the subject of compatibility. Well, there's the back catalog of games, which numbers in the tens of thousands with a shocking number of old games still being playable on modern hardware. I fired up 1602 80, a game from almost 1602 80 on my Windows, 10 PC with a Titan RT X on it with minimal tinkering required. 
That'S crazy! So we're actually working on a collab with good old games. Right now to show this off make sure your sub, so you don't miss it on the subject of tinkering Windows games, particularly the older ones, allow for a ton of it with large communities that have built everything from their own servers from multiplayer to mods that alter Visuals or gameplay elements and even mods that change the genre of the original title fun fact for you, young kids out there dota used to be a custom map in Warcraft 3. Finally, there's the advantage that comes naturally with being the incumbent gaming platform support wan na try out the hottest new peripherals like brand new graphics cards, VR headsets, haptic feedback, vests odds are excellent, that the Windows software is going to be much more polished than what's available. For other platforms, that is, if anything exists, for them at all, RTX real-time ray tracing on Mac. 
Please is actually a common one for users of every platform and it's that it just works or because I don't feel like something new like Apple Microsoft has made it OS that, for the most part, works as intended. Out-Of-The-Box, no real extra effort is needed. Thanks to Auto magical third-party driver installs through Windows Update when you get into the weeds with obscure devices, hardware compatibility on the platform does have its issues, but for the average user it is much better than it used to be, and so is the general intuitiveness of Using it I mean I still remember when they introduced the documents and pictures folder. Comm 4 is the toolbox. The registry editor, if used responsibly, is just the tip of the iceberg when it comes to optimizing. The windows experience task manager got some big upgrades with Windows 8 and now makes it so simple to monitor CPU RAM network and even GPU usage. 
So anyone can do it, but if you want to go even further, this rabbit hole. 2 has pretty much no bottom resource monitor, gives you a much more granular. Look the information from cast manager, making it easy to identify processes that are sending large amounts of network data out or causing your disk to churn and slow down the rest of your system. Task scheduler is a crazy, powerful utility that lets. You have Windows, automate tasks for you. It can open and close programs for you when you log in and out it can send emails when tasks complete and you can even post to Twitter and Facebook using the window. Scheduler and power toys are back, so these are actually Microsoft provided tools that enthusiasts can use to add or enhance features.
 I was a huge fan of sync TOI back in the day and this new window management one for Windows, 10 looks sick. 5. Is the support base want to learn how to do some of the stuff? You'Ve talked about well, with 78 % of the worldwide desktop OS market share. If it exists, someone has probably done it so, like you want to become the new macro king. Well, there are tutorials on how to do that, need to troubleshoot a weird error between the official support from Microsoft, for both current and legacy windows and the thousands of enthusiasts on forums around the world. The odds of finding someone to help. You are pretty good. One. Great resource is actually our forum linked below, where our community is ready, willing and able to help feel free to check that out after the video 6 is productivity. Even Apple had to acknowledge. 
Windows is strength when it comes to buckling down and just getting some work done, whether you're trading stocks, writing reports, tracking financials, making super cool, PowerPoint, slides or making YouTube videos like us windows probably supports the software and the hardware that you need to get it done. Microsoft'S Office suite is incredibly powerful and works best on Windows if you want to do 3d or CAD work. Most of the industry-standard software is on Windows and, let's not forget the plethora of one-off and highly specialized programs needed for scientific study, engineering and many other industries. Now I wasn't sure where to put this little bit so we're gon na chuck it in productivity, shortcut keys, so many shortcut keys, classic control-alt-delete for when things go wrong windows and one two three and four to launch the corresponding app on your taskbar go ahead and 
Try it it's really cool and if you like bad one, you can grab the other new power toy that lets. You hold the Windows key to see all the shortcut keys for your active programs. Oh productivity bliss awaits my friends. Seven is OS unity with some notable exceptions. Windows hasn't changed too drastically over the years. So if you went straight from Windows, XP to Windows, 10 you'd probably find your way around it sooner rather than later, and if you're a technician. This can be really nice because it's not uncommon to find yourself working on a different version from one hour to the next. It'S a totally different experience compared to Linux, which has I don't know, I stopped counting after 30, let's just say a lot of different distros or versions that are designed for a multitude of different tasks or specialized use cases. 
There are mainstream optimized distros out there, but if you don't consult the internet beforehand as a newcomer, it can get really confusing and thing is even if you do consult the internet. People might not agree on which flavor of the month is vastly women do bunt to stop being cool. Eight was a bit of a surprise to me, but it came up a lot. So maybe I just take the taskbar and file explorer for granted. The modern taskbar is a great tool for maintaining a clean and organized desktop, giving you quick access to frequently used programs and offering up a quick preview of all of your active windows as for File Explorer. Well, it's got its issues. The search is pretty slow. The up, folder navigation is done. Sometimes, documents should go to sequel and slash users, slash your username, not this PC etc. 
But it's got wide support for thumbnail, previews lots of useful information readily available, and it requires no keyboard shortcut to cut paste. Sometimes you don't have to be great just better than your competitor. 9 is reliability with good Hardware. The days of daily blue screens are long. Gone crashes do still exist, but for years now I've experienced long periods of smooth and stable performance. Microsoft does have some work cut out for them to make their automatic updates mover in the regarde, but they at least seem to be aware of the problem. At this point, bringing us to ten finally sort of related to gaming to compatibility, got an old program from the Windows XP days. Well, there's a decent chance that, with some trial and error, you will be able to get it to run even in the latest. Builds of Windows 10. 
There are just so many specific use programs that have been written over the last couple of decades and losing access to them because of an OS update could be devastating for some people. Compatibility mode actually works more often than you'd think and when it doesn't, some quick googling will often bring up a solution, and the cool thing is that goes. Both ways got a computer that mom bought 10 years ago, but still wants to use. Well, there's a solid chance that Windows and most programs that run on it will still work on that, even if not very well. Our Skull trail system from 2008 was actually a great example of this no driver issues and, aside from a couple of games that refused to launch because of missing CPU instructions, our issues were related to performance rather than to compatibility, so guys go check out that video. 
If you haven't already now one of the tools we love using on Windows comes, of course, from our sponsor for this portion of today's video LastPass LastPass relieves the troubles of remembering your passwords and reduces the anxiety about getting locked out of your accounts and then waiting For reset password emails, you won't need to write down, remember or reset passwords anymore with LastPass LastPass allows you to keep track of an unlimited number of passwords and not just passwords. Even just things like you know, Wi-Fi codes or just things you want to remember and store somewhere safe, and it doesn't only work for desktop it even works on mobile sites and apps for both iOS and Android. When you open an app or a site, LastPass will fill in your username and password making logging in easy, so click the link below to find out more about LastPass. So, thanks for watching guys hope you enjoyed this video see you 
0 notes
tech-battery · 4 years
Text
Best Intel Z490 motherboards
The best Intel Z490 motherboard is a must If you’ve made the decision to jump onto the Intel 10th Gen Comet Lake train and want to get the most out of your new CPU. Intel's Z490 boards are equipped with the new LGA 1200 socket, and even the more affordable B460 options feature improved connectivity, networking, power delivery components and (take this with a grain of salt) future 11th Gen Rocket Lake compatibility.
So while you need a whole new motherboard platform for Intel's new chips, there's a strong chance it will last for a few CPU generations to come. Especially given Intel's propensity for 14nm silicon… While 10th Generation CPUs may not have shaken the foundations of the tech world, they are the fastest gaming chips around. If your current gaming machine is several years old then upgrading right now will get you a faster system in every way. Faster cores and more of them, faster networking, faster and higher capacity memory, faster storage, and faster USB amongst other things. Upgrading from the 7th, 8th or 9th to 10th generation might not be a big leap, but 2nd to 10th Gen sure is!
The K-series CPUs have 125W TDPs, though the higher core count models will actually pull a lot more power than that. This means even budget Z490 motherboards are built with robust power delivery circuitry. One of the areas the Intel Z490 chipset might be seen as lacking in is official PCIe 4.0 support, something which is present on AMD’s competing X570 platform. Some manufacturers, such as MSI, are claiming PCIe 4.0 support on their Z490 motherboards, even though it is not supported by 10th generation CPUs and hence cannot be validated. Intel won’t confirm any details one way or the other about its future products, so for now PCIe 4.0 support on Z490 seems sketchy at best and shouldn’t factor into a purchase decision at this point in time.
While the specs have improved, we’re still not impressed with the upward trend in pricing. Just a year or so ago we were shocked to see $1,000 high-end desktop motherboards for Threadripper or the X-series chips, but now manufacturers are clearly happy to push the envelope with most of them offering boards getting on for $800 even though Z490 is still a mainstream chipset. We’re not dealing with quad channel memory or a ton of PCIe lanes, but these monstrous ~$800 motherboards have everything else thrown at them. They might seem crazy, but people are buying them… and if you want the best Z490 motherboard experience that's what you're aiming for.
1. Asus ROG Maximus XII Extreme
The best Z490 motherboard, for the price of a full gaming PC...
Size : E-ATX | Memory support: 4x DIMM, up to 128GB, DDR4-4700 (OC) | Expansion slots: 2x PCIe 3.0 x16 (or x8/x8), 1x PCIe 3.0 x4 | Storage: 2x M.2, 2x M.2 (DIMM.2 board), 8x SATA 6Gbps | Rear USB: 10x USB 3.2, 2x USB 2.0 | Video ports: 2x Thunderbolt 3 ports on extension card (DP1.4) | Network : 1x 10Gb Marvell ethernet, 1x 2.5Gb Intel ethernet, Intel WiFi 6 wireless
OC performance
Stunning bundle
Incredible build quality
I cannot pretend that this makes any sense from a rational standpoint, y'know, one where money actually exists and comes in finite amounts. But if we're talking about the best Z490 motherboard, for me, this is it. The Asus ROG boards have always been high-end offerings—though there are lower-spec options here in this guide too—but I do feel that MSI has been the enabler. Its Godlike boards have really cemented the idea of the ultra-enthusiast motherboard which cost, if you'll pardon the redacted expletive, f*** you money. After the arbitrarily priced £777 X570 Godlike, Asus has obviously decided it has carte blanche to charge whatever it likes.
But if you want a features list as long as the Dead Sea Scrolls then you're going to have to pay for it. What I will say is that features list isn't just stuffed with makeweight extras, things just stuffed into the package to fill it out and try and justify that price. Everything you get with the Maximus XII Extreme is useful for a super high-end Comet Lake build, from the extra fan controller, DIMM.2 storage expansion, to the Thunderbolt card. But I would like to make special mention of two of my favourite things in the package: the multi-head screwdriver and the braided SATA cables. Mmm.
You also get great performance. On the whole, the MSI Z490 Godlike does just about have the edge in straight speeds, but if I'm buying a Core i9 10900K I want to overclock the hell out of it, and I got a result I would be happy running at consistently from the Asus, where the Godlike just got a little too toasty for my taste at 5.3GHz all-core. Both are great high-end boards, but I'm going to side with the Asus for now.
2. MSI MEG Z490 Godlike
The best Z490 motherboard if you want to sell that other kidney too
Size: E-ATX | Memory support: 4x DIMM, up to 128GB, DDR4-5000 (OC) | Expansion slots: 3x PCIe 3.0 x16 (x16/x0/x4 or x8/x8/x4), 1x PCIe 3.0 x1 | Storage : 3x M.2, 6x SATA 6Gbps, 2x M.2 Xpander-Z | Rear USB: 8x USB 3.2, 2x USB 2.0 | Video ports: 2x Thunderbolt 3 | Network : 1x 10Gb Aquantia, 1x 2.5Gb Realtek LAN, Intel WiFi 6 wireless
Top CPU performance
Great bundle
PCIe 4.0 support. Kinda
The latest MSI Godlike is a fantastic high-end Z490 motherboard. I might still struggle with the naming scheme, and the similarly offensive price tag, but it's tough to argue with the technology that it offers. The power delivery system is something that's going to get seriously stretched by the Comet Lake architecture, and with 16 phases and 90A Smart Power Stage the Godlike has been designed to maintain a steady supply no matter what the Core i9 10900K wants to draw through it.
And it does an incredible job with the power you give it too. The performance I got out of the Godlike put it a shade ahead of the Maximus XII Extreme in gaming terms, though only by a couple of frames per second at best, but it did stretch that lead a touch when it came to video encoding. At stock speeds it's a touch cooler too, though that didn't translate into the overclocked performance as it peaked at 99°C when running at the 5.3GHz all-core peak I hit with the i9. Still, it wasn't throttling even so, but I wouldn't be happy running at that level consistently.
But the Godlike does offer PCIe 4.0 support. In a fashion. Just not with Comet Lake. The M.2 Xpander-Z will support up to PCI 4.0 bandwidth, though there is a sticker on it which adds the caveat that transfer speed might be limited by the chipset and processor. There is the promise of Rocket Lake bringing PCIe 4.0 support to the Intel ecosystem, and the Z490 is supposedly compatible, so there's a chance a Gen4 NVMe SSD might be able to take advantage down the line.
The overall bundle, build quality, and OC performance have me siding with the ROG board in the head-to-head, but it's a close run thing, and you could also make a convincing argument for the MSI Godlike being the best Z490 motherboard around.
MSI MPG Z490 Gaming Carbon WiFi
A vaguely affordable Z490 motherboard with competitive performance
Size : ATX | Memory support: 4x DIMM, up to 128GB, DDR4-4800 (OC) | Expansion slots: 3x PCIe 3.0 (x16/x0/x4 or x8/x8/x4), 2x PCIe 3.0 x1 | Storage: 2x M.2, 6x SATA 6Gbps | Rear USB: 5x USB 3.2, 2x USB 2.0 | Video ports: 1x DisplayPort, 1x HDMI | Network : 1x 2.5Gb LAN, Intel WiFi 6 wireless
More reasonably priced Z490
Still competitive performance
The sparse back panel and missing OLED displays will tell you we're back into normal motherboard territory again. The rarified air of the ultra-enthusiast pairing at the top of the test might make one giddy, but the Z490 Gaming Carbon will bring us back down to earth without a bump. Sure, you're never going to get the same level of luxury feature list as you'll find with either the Godlike or Maximus XII boards, but when it comes to the nuts and bolts of pure performance it's right up there.
Where it matters, in the gaming performance stakes, there's practically nothing between any of the Z490 boards we've tested, and it's only ever a little behind when it comes to the actual CPU performance in productivity apps. When it comes to overclocking, however, the MPG Z490 Gaming Carbon WiFi inevitably can't compare to the big boys with our 10900K running at its peak.
The power componentry and cooling isn't enough to stop the thirsty CPU from throttling when it's pushed to its 5.3GHz all-core maximum. But, while that might mean it's not the board you'd choose for an overclocked i9 machine, that's a tiny niche of gamers, and for either i5 or i7 CPUs the MSI Gaming Carbon is still a quality home for your Comet Lake CPU.
4. MSI MEG Z490 Ace
MSI's Z490 ace in the hole.
Size : ATX | Memory support : 4x DIMM, up to 128GB, DDR4-4800 (OC) | Expansion slots: 3x PCIe 3.0 (x16/x0/x4 or x8/x8/x4), 2x PCIe 3.0 x1 | Storage: 3x M.2, 6x SATA 6Gbps | Rear USB: 6x USB 3.2, 2x USB 2.0 | Video ports: N/A | Network : 1x Realtek 2.5Gb, 1x Intel 1GB LAN, Intel WiFi 6 wireless
Subtle looks
Strong VRM design
High-end performance
Another top board from MSI, the Z490 Ace looks great with its subtle gold highlights and metallic elements. It’s good to have the primary M.2 slot above the GPU rather than cooking away underneath it. Other little things like the white post code display, fast booting and excellent fan control makes the Ace a really refined and polished product. Even the M.2 installation is easy, with no need to remove a heap of screws and half of the entire heatsink.
MSI has knocked it out of the park with the Ace’s strong VRM design. Dual 8-pin power connectors feed a 16+1 phase 90A design that’s cooled by big chunks of heatsink. It’s got a small fan that only spins up when it’s required. There’s also rear mosfet baseplates, which all told means you can overclock to your heart’s content. Does any Z490 motherboard have a genuinely better VRM?
Its excellent finish, well laid out BIOS, feature set, top class VRM design, performance efficiency and capable overclocking means all boxes are ticked, as it should be if you’re going to drop nearly $400 on a motherboard. Ace by name, Ace by nature.
5. Gigabyte Z490 Aorus Master
Gigabyte's best Z490 motherboard is a luxurious Aorus.
Size : ATX | Memory support: 4x DIMM, up to 128GB, DDR4-5000 (OC) | Expansion slots: 3x PCIe 3.0 (x16/x0/x4 or x8/x8/x4) | Storage : 3x M.2, 6x SATA 6Gbps | Rear USB: 6x USB 3.2, 4x USB 2.0 | Video ports: 1x HDMI | Network : Intel 2.5Gb LAN, Intel WiFi 6 wireless
Built to last
Huge number of USB on backplate
Quality audio
The first thing we noticed when taking the Z490 Aorus Master out of its box was its weight. This is a board gives the impression that it’s built to last, and at $390 we’d certainly hope it does! For the money, it’s clear the Master is a lot of motherboard. Interestingly, where some Aorus boards tend to light up like a laser light show, the Master really dials down the RGB with just a little bit of subtle lighting over the rear I/O and the chipset heatsink. The overall look is very understated, modern and mature.
The Master has no less than 10 USB ports on the back panel, which is impressive, although it’s also the only board to have a single LAN port, though it is a quality Intel 2.5G controller. It does have Wi-Fi 6, but if dual LAN is important to you, you’ll have to look elsewhere. We’re usually impressed with Gigabyte’s audio and that’s also the case here, with an ESS Sabre DAC and quality component choices. You also get the fairly standard 6 SATA ports and triple M.2 slots which distinctively feature thermal padding for both sides of the drive. A nice touch. Note that Gigabyte is also touting its PCIe 4.0 support, though as we said in the intro, it’s not something that should factor into a purchase decision.
Perhaps the only drawback, and honestly it’s hard to even call it that when the differences are so small, was that performance tended to trail the pack in many cases. It’s nothing to worry about, but when you’re comparing motherboards that are all genuinely strong contenders, you have to look for some differentiation, and for the Aorus, this is it. Don't necessarily let that put you off, it’s still a board we’d be proud to own. Heck if it was $50 cheaper, it might have been a winner.
6. ASRock Z490 Taichi
ASRock has moved up to the premium tier.
Size : ATX | Memory support: 4x DIMM, up to 128GB, DDR4-4666 (OC) | Expansion slots: 3x PCIe 3.0 (x16/x0/x4 or x8/x8/x4), 2x PCIe 3.0 x1 | Storage : 3x M.2, 6x SATA 6Gbps | Rear USB: 8x USB 3.2 | Video ports: 1x DisplayPort, 1x HDMI | Network : Dragon 2.5Gb, 1x Intel 1Gb LAN, Intel WiFi 6 wireless
Great-looking board
Decent overclocker
The Z490 Taichi doesn’t deviate too much from its recent siblings in terms of its looks. I mean, why would you though with its lovely retro, almost analogue theme. It looks terrific. You get a nice splash of RGB lighting too of course.
The Taichi has a beefed up 15-phase VRM system fed by dual 8-pin power connectors designed to cope with the demands of 10th Gen processors. Each choke is rated for 60A, which is less than the 90A the MSI, Asus, and Gigabyte offer. The Taichi incorporates no less than three small fans into the heatsink, though thankfully they are all but silent and couldn’t be heard above the sound of our AIO CPU cooler and pump. When the board is presented with a light load, they don’t spin at all.
Asrock is also touting its PCIe 4.0 readiness, though again that’s jumping the gun a bit with Intel not commenting on future compatibility or support at this time. The performance of the Taichi was interesting. It scored very well in bandwidth sensitive applications, indicating that it sets aggressive memory sub timings. Our DDR4-4000 C16 test required a voltage bump in order to achieve stability too. The board didn’t miss a beat when pushing our 10900K to 5.2GHz on all cores, 5.3 is possible, though the heat from the CPU made it unstable, not exactly a fault of the board.
The ASRock Z490 ticks almost all the boxes. It’s got an intuitive, easy to navigate BIOS and a typically strong Taichi feature set. Its VRM isn’t as strong as the other boards in the test and it’s perhaps a BIOS update away from feeling really polished. Saying that is being tough on it, though; the Taichi is a strong offering.
7. Asus ROG Maximus XII Hero Wi-Fi
Typical ROG refinement for a typically high price
Size : ATX | Memory support: 4x DIMM, up to 128GB, DDR4-4800 (OC) | Expansion slots: 3x PCIe 3.0 (x16/x0/x4 or x8/x8/x4), 3x PCIe x1 | Storage : 3x M.2, 6x SATA 6Gbps | Rear USB: 8x USB 3.2, 2x USB 2.0 | Video ports: 1x HDMI | Network : 1x Marvell 5Gb, 1x Intel 1Gb LAN, Intel WiFi 6 wireless
Fantastic Asus BIOS
High build-quality
Good overclocking
It says a lot about Z490 motherboard pricing when the Hero – usually a value offering in the Asus Maximus range – sells for an eye watering $400. When compared to the Maximus XII Extreme though, the Hero feels like a bargain. The Hero comes with a 14+2 phase VRM design with each stage rated for 60A. It’s not inferior so to speak, but the Max XII Extreme, MSI, and Gigabyte boards have a better VRM spec along with dual 8-pin power vs the 8+4 pin of the Hero. There’s a bundled small cooling fan that you should install if you wish to push the board hard.
Asus really goes the extra mile when it comes to BIOS features, though perhaps it can be overwhelming for novice users. If you want to tweak, and then tweak some more, then Asus has you covered. The Hero was effortlessly able to overclock our 10900K to 5.2GHz and our memory to DDR4-4000. Try as we might, 5.3 GHz is a step too far for our cooling. If you’ve got top shelf cooling, then the Hero will handle it. Our memory took exactly one try to get to DDR4-4000 16-16-16 which is a nice performance sweet spot. Don’t forget that Asus offers the Maximus XII Apex if OC is your main game.
We can’t escape from the fact that the Maximus XII Hero is $400. Economic woes notwithstanding, that’s a lot of money, particularly when the equally strong MSI and Gigabyte Z490 motherboards are a bit cheaper. Having said that, Asus rarely missteps with its ROG boards and the company has rightly earned itself a loyal band of followers who will hardly consider another board. If you choose the Hero you get the advantage of impressive 5G networking, excellent build quality and you’ll have a capable and refined motherboard. Its VRM isn’t class leading though, but that really only applies to extreme overclockers. We wish is was a few dollars cheaper but if you do go for the Maximus XII Hero, it will do your system justice.
0 notes
grassroutes · 5 years
Text
One Mix 3S Yoga: Not the First Netbook, But Probably The Best
Our verdict of the One-Netbook One Mix 3S: If you absolutely need maximum portability but want all the power you can get, this is a good buy, but the same money can buy you a much better laptop.710
It’s official: netbooks are back. It started with a few smaller computers here and there, but it wasn’t long before the trickle became a torrent. Things have changed though. The netbooks of 10 years ago were tiny and portable, yes, but they were also massively underpowered.
On the other hand, the netbooks we’re seeing released now can be as powerful as any other laptop, just scaled down in size. The One Mix 3S is an example of just how mighty these pint-sized powerhouses can be. Whether you actually need one for yourself is another matter entirely.
youtube
One Mix 3S Hardware Specifications
The One Mix 3S comes in a few different variations. The specs of our review unit, which appears to be the base model, are reflected below. You can also get it with major power upgrades like an i7 processor, though obviously, it will cost you more.
CPU: Intel Core M3-8100Y (1.1GHz to 3.4GHz)
GPU: Intel UHD Graphics 615
RAM: 16GB
Storage: 512 NVMe SSD
Screen: 8.4″, 2560×1600 55Hz multi-touch IPS display @ 358ppi
Battery: 8,600 mAh 3.7V
Dimensions: 204mm x 129mm x 14.9mm or 8.03 x 5.07 x 0.58 inches
Wireless: Dual-band 802.11ac / Bluetooth 4.0
Ports: USB-C, USB 3.0, Micro HDMI, Headset, TF card slot
Other: Fingerprint scanner, optional stylus with 4,096 pressure levels
Price: under $900 at GeekBuying.com
The One Mix 3S runs Windows 10 Home. We’d have preferred to see Windows 10 Pro, but this helps keep the initial price down, and you can always upgrade easy enough from the Windows Store.
Body and Design
 As you can likely tell by either reading the size specs above or just looking at the photos, this is a tiny computer. Because of the overall small dimensions, it looks a little on the chunky side, but compared to my MacBook Pro, it’s just as thin.
Our review unit came in a sleek black finish that was stunning out of the box. There’s a reason I say “was” instead of “is.” This is a bit of a fingerprint magnet. Within a few minutes of handling the laptop, it didn’t look nearly as sleek as it had out of the box. A quick wipe with a microfibre cloth had it back to its original look.
The netbooks of old were often flimsy, but this is anything but. Like the One Mix 2S, this is machined from solid aluminum, giving it a very solid feel. This extends to the netbook as a whole, especially the hinges for the screen, which have been the downfall of many a laptop.
Given the small size, this is among the most portable computers you’ll find. It won’t quite fit in your pocket as advertised, but it will certainly fit in a purse and can easily fit into all but the most jam-packed backpacks.
Display and Brightness
The screen is almost overkill, with a resolution of 2560 x 1600 packed into an 8.4-inch screen size. That’s close to the resolution of a Surface Pro, but in a much smaller screen size. This results in an ultra-crisp pixel density of 358 pixels per inch.
The color isn’t as good as the overall display, however. Colors are a little muted and washed out when compared to the displays on other gadgets I had handy. Granted I had to look, but this was definitely noticeable.
One issue I’m not sure will be so obvious, but did give me pause during testing, is the refresh rate. Instead of the usual 60Hz refresh rate, the One Mix 3S runs at 55Hz. This caused a noticeable strobing effect when I was shooting photos and video.
Even after I was done shooting, I could swear I noticed a vague strobing effect when looking at the screen. This could have been in my head, but it’s worth keeping in mind if you’re sensitive to this sort of thing.
Is the Tiny Keyboard Usable?
As with any computer like this, the keyboard is small. There’s no getting around that. Because of its small size, the keyboard layout has changed. This is the actual problem with these netbooks, rather than the individual key size.
If a company somehow manages to fit a standard keyboard layout into a netbook-sized keyboard, they’ll deserve some sort of award. Until then, it’s a matter of figuring out which sacrifices you can comfortably live with and which are simply untenable.
With the One Mix 3S, there are a few specific problems. The first and most common is the tiny space bar. For frequent users of the tab key, it’s even worse, as this key has been relocated to the function keys. The quote keys are down to the right of the space bar as well.
One plus of the tiny keyboard is that it is backlit. Strangely though, the backlight is aggressive about turning off. This is likely in the name of saving battery life, but if you frequently type in a dark room, this will get annoying quickly.
What’s the Point of the Pointer?
While older netbooks usually still included a touchpad, these are hard to find on modern netbooks. In the case of the One Mix 3S, it uses an infrared induction mouse. We’ve seen these before, and while not ideal, they are usable.
The problem in the case of the One Mix 3S is the location. On older One Mix models and similar netbook-style computers like the Chuwi MiniBook 8″, the “mouse” is located in the middle of the spacebar, similar to where a ThinkPad-style trackpoint would be. Here it’s below the space bar and in between the left and right mouse buttons, making it unwieldy to use.
As I and others have said before, an actual rubber trackpoint would be both smaller and easier to use. This could simply be familiarity, but the induction mouse would be maddening if it was your only option for navigating the OS.
Fortunately, the screen on the One Mix 3S is a touchscreen, which alleviates almost all of the mouse-related issues. Even better, our review unit shipped with a stylus, which made selecting tiny user interface elements possible.
The touchscreen and stylus allow the One Mix 3S to pull double duty as a Windows 10 tablet. It’s not going to give Microsoft’s Surface Pro devices a run for their money, but it works quite well in tablet mode.
Performance
As reading the specs might have clued you in, the One Mix 3S isn’t a toy. This is a serious computer that just happens to be in an ultra-portable form factor. While I noticed limitations from the screen size and keyboard, I never felt like the hardware was slowing me down.
Thermal issues can pop up on smaller form factor computers like this, so you may be glad to know the fans are pretty aggressive. They don’t get overly loud, but they tend to kick in as soon as the computer starts to work at all.
I’m not quite sure why, but when I tried running benchmarks, they didn’t line up with the performance I noticed. Geekbench gave me scores of 654 for single-core performance and 1263 for multi-core performance, much slower than scores I’ve seen on less powerful hardware. The computer certainly felt more responsive than these scores would indicate.
Another area where the One Mix 3S excelled was Wi-Fi speed. Unlike the One Mix 2S, which was hampered by problematic Wi-Fi, I never noticed any issues with the One Mix 3S.
Battery Life
The One Mix 3S sports an 8,600mAh battery, bigger than the 6,500mAh battery in the One Mix 2S. Of course, this model has a bigger screen, but battery life is still improved overall.
I averaged around 6 hours of life per charge during my testing, though this varied based on how much I was pushing the hardware. The good news is that the One Mix 3S doesn’t seem to sip much power when the screen is shut, so you don’t need to worry about shutting it down to save power.
Like other modern netbooks, the One Mix 3S uses USB-C for charging. It only has a single USB-C port, which can be problematic, but like other similar computers, you can always get around this by using a hub.
How long it takes to charge will vary based on the charger you’re using. With the included 12V, 2.5A charger, charge time was around 2 to 3 hours.
Should You Buy the One Mix 3S?
With some other modern netbooks, I’ve noticed that while the specs seem powerful enough, actually using them still feels slow. That isn’t the case with the One Mix 3S. This feels like a computer that, with the exception of editing video, I’d have no problem using on a day to day basis. That is assuming that I’d eventually get used to the keyboard.
That said, if you’re just looking for a moderately powerful laptop, the One Mix 3S shouldn’t be at the top of the list. If portability is your first concern, with power coming in a close second, then the One Mix 3S is absolutely worth looking at. The sheer amount of RAM alone will be very useful when it comes to everyday computing tasks.
If you’re not in a rush, you might just want to wait a few months and see what comes down the pike next. We’re in a great time for netbooks, with each new model seemingly better than the last. If that trend continues, I can’t wait to see what the future has in store.
The One Mix 3S, configured as we reviewed it, will cost you between just under $900 currently. That certainly isn’t cheap, and as mentioned above, that will buy you plenty of power in a standard-sized laptop. That said, if you crave portability but can’t sacrifice the power, you could certainly do a lot worse.
Enter the Competition!
One-Netbook One Mix 3S Giveaway
Read the full article: One Mix 3S Yoga: Not the First Netbook, But Probably The Best
One Mix 3S Yoga: Not the First Netbook, But Probably The Best posted first on grassroutespage.blogspot.com
0 notes
droneseco · 5 years
Text
One Mix 3S Yoga: Not the First Netbook, But Probably The Best
Our verdict of the One-Netbook One Mix 3S: If you absolutely need maximum portability but want all the power you can get, this is a good buy, but the same money can buy you a much better laptop.710
It’s official: netbooks are back. It started with a few smaller computers here and there, but it wasn’t long before the trickle became a torrent. Things have changed though. The netbooks of 10 years ago were tiny and portable, yes, but they were also massively underpowered.
On the other hand, the netbooks we’re seeing released now can be as powerful as any other laptop, just scaled down in size. The One Mix 3S is an example of just how mighty these pint-sized powerhouses can be. Whether you actually need one for yourself is another matter entirely.
youtube
One Mix 3S Hardware Specifications
The One Mix 3S comes in a few different variations. The specs of our review unit, which appears to be the base model, are reflected below. You can also get it with major power upgrades like an i7 processor, though obviously, it will cost you more.
CPU: Intel Core M3-8100Y (1.1GHz to 3.4GHz)
GPU: Intel UHD Graphics 615
RAM: 16GB
Storage: 512 NVMe SSD
Screen: 8.4″, 2560×1600 55Hz multi-touch IPS display @ 358ppi
Battery: 8,600 mAh 3.7V
Dimensions: 204mm x 129mm x 14.9mm or 8.03 x 5.07 x 0.58 inches
Wireless: Dual-band 802.11ac / Bluetooth 4.0
Ports: USB-C, USB 3.0, Micro HDMI, Headset, TF card slot
Other: Fingerprint scanner, optional stylus with 4,096 pressure levels
Price: under $900 at GeekBuying.com
The One Mix 3S runs Windows 10 Home. We’d have preferred to see Windows 10 Pro, but this helps keep the initial price down, and you can always upgrade easy enough from the Windows Store.
Body and Design
  As you can likely tell by either reading the size specs above or just looking at the photos, this is a tiny computer. Because of the overall small dimensions, it looks a little on the chunky side, but compared to my MacBook Pro, it’s just as thin.
Our review unit came in a sleek black finish that was stunning out of the box. There’s a reason I say “was” instead of “is.” This is a bit of a fingerprint magnet. Within a few minutes of handling the laptop, it didn’t look nearly as sleek as it had out of the box. A quick wipe with a microfibre cloth had it back to its original look.
The netbooks of old were often flimsy, but this is anything but. Like the One Mix 2S, this is machined from solid aluminum, giving it a very solid feel. This extends to the netbook as a whole, especially the hinges for the screen, which have been the downfall of many a laptop.
Given the small size, this is among the most portable computers you’ll find. It won’t quite fit in your pocket as advertised, but it will certainly fit in a purse and can easily fit into all but the most jam-packed backpacks.
Display and Brightness
The screen is almost overkill, with a resolution of 2560 x 1600 packed into an 8.4-inch screen size. That’s close to the resolution of a Surface Pro, but in a much smaller screen size. This results in an ultra-crisp pixel density of 358 pixels per inch.
The color isn’t as good as the overall display, however. Colors are a little muted and washed out when compared to the displays on other gadgets I had handy. Granted I had to look, but this was definitely noticeable.
One issue I’m not sure will be so obvious, but did give me pause during testing, is the refresh rate. Instead of the usual 60Hz refresh rate, the One Mix 3S runs at 55Hz. This caused a noticeable strobing effect when I was shooting photos and video.
Even after I was done shooting, I could swear I noticed a vague strobing effect when looking at the screen. This could have been in my head, but it’s worth keeping in mind if you’re sensitive to this sort of thing.
Is the Tiny Keyboard Usable?
As with any computer like this, the keyboard is small. There’s no getting around that. Because of its small size, the keyboard layout has changed. This is the actual problem with these netbooks, rather than the individual key size.
If a company somehow manages to fit a standard keyboard layout into a netbook-sized keyboard, they’ll deserve some sort of award. Until then, it’s a matter of figuring out which sacrifices you can comfortably live with and which are simply untenable.
With the One Mix 3S, there are a few specific problems. The first and most common is the tiny space bar. For frequent users of the tab key, it’s even worse, as this key has been relocated to the function keys. The quote keys are down to the right of the space bar as well.
One plus of the tiny keyboard is that it is backlit. Strangely though, the backlight is aggressive about turning off. This is likely in the name of saving battery life, but if you frequently type in a dark room, this will get annoying quickly.
What’s the Point of the Pointer?
While older netbooks usually still included a touchpad, these are hard to find on modern netbooks. In the case of the One Mix 3S, it uses an infrared induction mouse. We’ve seen these before, and while not ideal, they are usable.
The problem in the case of the One Mix 3S is the location. On older One Mix models and similar netbook-style computers like the Chuwi MiniBook 8″, the “mouse” is located in the middle of the spacebar, similar to where a ThinkPad-style trackpoint would be. Here it’s below the space bar and in between the left and right mouse buttons, making it unwieldy to use.
As I and others have said before, an actual rubber trackpoint would be both smaller and easier to use. This could simply be familiarity, but the induction mouse would be maddening if it was your only option for navigating the OS.
Fortunately, the screen on the One Mix 3S is a touchscreen, which alleviates almost all of the mouse-related issues. Even better, our review unit shipped with a stylus, which made selecting tiny user interface elements possible.
The touchscreen and stylus allow the One Mix 3S to pull double duty as a Windows 10 tablet. It’s not going to give Microsoft’s Surface Pro devices a run for their money, but it works quite well in tablet mode.
Performance
As reading the specs might have clued you in, the One Mix 3S isn’t a toy. This is a serious computer that just happens to be in an ultra-portable form factor. While I noticed limitations from the screen size and keyboard, I never felt like the hardware was slowing me down.
Thermal issues can pop up on smaller form factor computers like this, so you may be glad to know the fans are pretty aggressive. They don’t get overly loud, but they tend to kick in as soon as the computer starts to work at all.
I’m not quite sure why, but when I tried running benchmarks, they didn’t line up with the performance I noticed. Geekbench gave me scores of 654 for single-core performance and 1263 for multi-core performance, much slower than scores I’ve seen on less powerful hardware. The computer certainly felt more responsive than these scores would indicate.
Another area where the One Mix 3S excelled was Wi-Fi speed. Unlike the One Mix 2S, which was hampered by problematic Wi-Fi, I never noticed any issues with the One Mix 3S.
Battery Life
The One Mix 3S sports an 8,600mAh battery, bigger than the 6,500mAh battery in the One Mix 2S. Of course, this model has a bigger screen, but battery life is still improved overall.
I averaged around 6 hours of life per charge during my testing, though this varied based on how much I was pushing the hardware. The good news is that the One Mix 3S doesn’t seem to sip much power when the screen is shut, so you don’t need to worry about shutting it down to save power.
Like other modern netbooks, the One Mix 3S uses USB-C for charging. It only has a single USB-C port, which can be problematic, but like other similar computers, you can always get around this by using a hub.
How long it takes to charge will vary based on the charger you’re using. With the included 12V, 2.5A charger, charge time was around 2 to 3 hours.
Should You Buy the One Mix 3S?
With some other modern netbooks, I’ve noticed that while the specs seem powerful enough, actually using them still feels slow. That isn’t the case with the One Mix 3S. This feels like a computer that, with the exception of editing video, I’d have no problem using on a day to day basis. That is assuming that I’d eventually get used to the keyboard.
That said, if you’re just looking for a moderately powerful laptop, the One Mix 3S shouldn’t be at the top of the list. If portability is your first concern, with power coming in a close second, then the One Mix 3S is absolutely worth looking at. The sheer amount of RAM alone will be very useful when it comes to everyday computing tasks.
If you’re not in a rush, you might just want to wait a few months and see what comes down the pike next. We’re in a great time for netbooks, with each new model seemingly better than the last. If that trend continues, I can’t wait to see what the future has in store.
The One Mix 3S, configured as we reviewed it, will cost you between just under $900 currently. That certainly isn’t cheap, and as mentioned above, that will buy you plenty of power in a standard-sized laptop. That said, if you crave portability but can’t sacrifice the power, you could certainly do a lot worse.
Enter the Competition!
One-Netbook One Mix 3S Giveaway
Read the full article: One Mix 3S Yoga: Not the First Netbook, But Probably The Best
One Mix 3S Yoga: Not the First Netbook, But Probably The Best published first on http://droneseco.tumblr.com/
0 notes
suzanneshannon · 4 years
Text
Review of the Surface Book 3 for Developers
I was offered a Surface Book 3 to use as a loaner over the the last 5 weeks. I did a short video teaser on Twitter where I beat on the device with a pretty ridiculous benchmark - running Visual Studio 2019 while running Gears of War and Ubuntu under WSL and Windows Terminal. I have fun. ;)
Hey they loaned me a @surface book 3! So...I threw EVERYTHING at it...Visual Studio, Gears of War, Ubuntu/WSL2/Windows...*all at the same time* because why not? LOL (review very soon) pic.twitter.com/FmgGCBUGuR
— Scott Hanselman (@shanselman) May 14, 2020
Size and Weight
My daily driver has been a Surface Book 2 since 2017. The new Surface Book 3 is the exact size (23mm thick as a laptop) and weight (3.38 and 4.2 lbs.) as the SB2. I have had to add a small sticker to one otherwise I'd get them confused. The display resolutions are 3000×2000 for the 13.5-inch model and 3240×2160 for the 15-inch one that I have. I prefer a 15" laptop. I don't know how you 13" people do it.
Basically if you are a Surface Book 2 user the size and weight are the same. The Surface Book 3 is considerably more power in the same size machine.
CPU and Memory
They gave me an i7-1065G7 CPU to test. It bursts happily over 3.5 Ghz (see the compiling screenshot below) and in my average usage hangs out in the 2 to 1.8 range with no fan on. I regularly run Visual Studio 2019, VS Code, Teams, Edge (new Edge, the Chromium one), Ubuntu via WSL2, Docker Desktop (the WSL2 one), Gmail and Outlook as PWAs, as well as Adobe Premiere and Audition and other parts of the Creative Suite. Memory usually sits around 14-18 gigs unless I'm rendering something big.
It's a 10th gen Intel chip and as the Surface Book 3 can detach the base from the screen, it's both a laptop and tablet. I gleaned from Anandatech that TDP is between 10 and 25W (usually 15W) depends on what is needed, and it shifts frequencies very fast. This is evident in the great battery life when doing things like writing this blog post or writing in Edge or Word (basically forever) versus playing a AAA game or running a long compile, building containers, or rendering a video in Premiere (several hours).
FLIP THE SCREEN AROUND? You can also when docked even reverse the screen! Whatever do you mean? It's actually awesome if you want an external keyboard.
All this phrased differently? It's fast, quickly, when it needs to be but it's constantly changing the clock to maximize power/thermals/battery.
SSD - Size and Speed
The device I was loaned has a Toshiba KXG60PNV2T04 Hard Drive 2TB NVMe M.2 that's MASSIVE. I'm used to 512G or maaybe a 1TB drive in a Laptop. I'm getting used to never having to worry about space. Definitely 1TB minimum these days if you want to play games AND do development.
I ran a CrystalBenchmark on the SSD and it did 3.2GB/s sequential reads! Sweet. I feel like the disk is not the bottleneck with my development compile tests below. When I consulted with the Surface team last year during the conception of the Surface Book 3 I pushed them for faster SSDs and I feel that they delivered with this 2TB SSD.
GPU - Gaming and Tensorflow
The 13.5-inch model now comes with an NVIDIA GeForce GTX 1650 Max-Q GPU with 4GB of GDDR5 memory in its Core i7 variant, while the 15-inch unit features a NVIDIA GeForce GTX 1660 Ti Max-Q with 6GB of GDDR6 memory. When running the Gears 5 Benchmark while plugged in (from the Extras menu, Benchmark) is has no issues with the default settings doing 60fps for 90% of the benchmark with a few dips into the 57 range depending what's on screen.
It's not a gaming machine, per se, but it does have a NVIDIA GeForce GTX 1660 Ti so I'm basically able to 1080p 60fps AAA games. I've played Destiny 2, Gears of War 5, and Call of Duty Modern Warfare on default settings at 60 fps without issue. The fan does turn on but it's very manageable. I like that whenever we get back into hotels I'll be able to play some games and develop on the same machine. The 15" also includes an Xbox Wireless Adapter so I just paired my controller with it directly.
I was also able to run Tensorflow with CUDA on the laptop under Windows and it worked great. I ran a model against some video footage from my dashcam and 5.1 gigs of video RAM was used immediately and the CUDA engine on the 1660Ti is visible working in Taskman. The commercial SKU has an NVIDIA Quadro RTX 3000 that is apparently even more tuned for CUDA work.
Developer Performance
When I built my Intel i9 Ultimate Desktop 3.0 machine and others, I like to do compile tests to get a sense of how much you can throw at machine. I like big project compiles because they are a combination of a lot of disk access and a lot of parallel CPU work. However, some projects do have a theoretical maximum compile speed because of the way the dependences flesh out. I like to use Orchard Core for benchmarks.
Orchard Core is a fully-featured CMS with 143 projects loaded into Visual Studio. MSBUILD and .NET Core supports both parallel and incremental builds.
A warm build of Orchard Core on IRONHEART my i9 desktop takes just under 10 seconds.
My 6 year old Surface Pro 3 builds it warm in 62 seconds.
A totally cold build (after a dotnet clean) on IRONHEART takes 33.3 seconds.
My Surface Pro 3 builds it cold in 2.4 minutes.
I'll do the same build on both my Surface Book 2 and this new Surface Book 3 to compare. I've excluded the source folders from Defender as well as msbuild.exe and dotnet.exe. I've also turned off the Indexer.
A cold build (after a dotnet clean) on this Surface Book 3 takes 46 seconds.
A warm build is 16.1 seconds
A cold build (after a dotnet clean) on my Surface Book 2 takes 115 seconds.
It's WAY faster than my Surface Book 2 which has been my daily driver when mobile for nearly 3 years!
Benchmarks are all relative and there's raw throughput, there's combination benchmarks, and all kinds of things that can "make a chart." I just do benchmarks that show if I can do a thing I did before, faster.
You can also test various guesses if you have them by adding parameters to dotnet.exe. For example, perhaps you're thinking that 143 projects is thrashing to disk so you want to control how many CPUs are used. This has 4 physical cores and 8 logical, so we could try pulling back a little
dotnet build /maxcpucount:4
The result with Orchard Core is the same, so there is likely a theoretical max as to how fast this can build today. If you really want to go nuts, try
dotnet build -v diag
And dig through ALL the timing info!
Webcam Quality
Might be odd to add this as its own section but we're all using our webcams constantly right now. I was particularly impressed with the front-facing webcam. A lot of webcams are 720p with mediocre white balance. I do a lot of video calls so I notice this stuff. The SB3 has a 1080p front camera for video and decent light pickup. When using the Camera app you can do up to 5MP (2560x1920) which is cool. Here's a pic from today.
Ports and Power and Sound and Wi-Fi
The Surface Book 3 has just one USB-C port on the right side and two USB 3.1 Gen 2s on the left. I'd have liked one additional USB-C so I could project on stage and still have one additional USB-C available...but I don't know what for. I just want one more port. That said, the NEW Surface Dock 2 adds FOUR USB-C ports, so it's not a big deal.
It was theoretically possible to pull more power on the SB2 than its power supply could offer. While I never had an issue with that, I've been told by some Destiny 2 players and serious media renderers that it could happen. With the SB3 they upped the power supply with 65W for the base 13.5-inch version and a full 127W for the 15-inch SKUs so that's not an issue any more.
I have only two Macs for development and I have no Thunderbolt devices or need for an eGPU so I may not be the ideal Thunderbolt consumer. I haven't needed it yet. Some folks have said that it's a bummer the SB3 doesn't have it but it hasn't been an issue or sticking point for any of my devices today. With the new Surface Dock 2 (below) I have a single cable to plug in that gives me two 4k monitors at 60Hz, lots of power, 4 USB-C ports all via the Dock Connector.
I also want to touch on sound. There is a fan inside the device and if it gets hot it will run. If I'm doing 1080p 60fps in Call of Duty WarZone you can likely hear the fan. It comes and goes and while it's audible when it's one, when the CPU is not maxed out (during 70% of my work day) the Surface Book 3 is absolutely silent, even when running the monitors. The fan comes on with the CPU is bursting hard over 3Ghz and/or the GPU is on full blast.
One other thing, the Surface Book 3 has Wi-Fi 6 even though I don't! I have a Ubnt network and no Wi-Fi 6 mesh points. I haven't had ANY issues with the Wi-Fi on this device over Ubnt mesh points. When copying a 60 gig video file over Wi-Fi from my Synology NAS I see sustained 280 megabit speeds.
The New Surface Dock - Coming May 26th
I'm also testing a pre-release Surface Dock 2. I suspect they wanted me to test it with the Surface Book 3...BUT! I just plugged in every Surface I have to see what would happen.
My wife has a Surface Laptop 2 she got herself, one son has my 6 year old old Surface Pro 3 while the other has a Surface Go he got with his allowance. (We purchased these over the last few years.) As such we have three existing Surface Docks (original) - One in the kids' study/playroom, one in the Kitchen as a generalized docking station for anyone to drop in to, and one in my office assigned me by work.
We use these individual Surfaces (varying ages, sizes, and powers) along with my work-assigned Surface Book 2 plus this loaner Surface Book 3, so it's kind of a diverse household from a purely Surface perspective. My first thought was - can I use all these devices with the new Dock? Stuff just works with a few caveats for older stuff like my Surface Pro 3.
RANDOM NOTE: What happens when you plug a Surface Pro 3 (released in 2014) into a Surface Dock 2? Nothing, but it does get power. However, the original Surface Dock is great and still runs 4096 x 2160 @30Hz or 2960 x 1440 @60Hz via mini DisplayPort so the Pro 3 is still going strong 6 years out and the kids like it.
So this Surface Dock 2 replaces the original Dock my office. The Surface Dock 2 has
2x front-facing USB-C ports (I use these for two 4k monitors)
2x rear-facing USB-C ports
2x rear-facing USB-A 3.2 (10Gbps) ports
1x Gigabit Ethernet port
1x 3.5mm audio in/out port
Kensington lock slot - I've never used this
First, that's a lot of USB-C. I'm not there yet with the USB-C lifestyle, but I did pick up two USB-C to full-size DisplayPort cables at Amazon and I can happily report that I can run both my 4k monitors at 60hz plus run the main Surface Book 3 panel. The new Dock and its power supply can push 120 watts of power to the Surface with a total of 199 watts everything connected to the dock. I've got a few USB-C memory sticks and one USB-C external hard drive, plus the Logitech Brio is USB 3, so 6 total ports is fine with 4 free after the two monitors. I also Gigabit wired the whole house so I use the Ethernet port quite happily.
Initially I care about one thing - my 4k monitors. Using the USB-C to DisplayPort cables I plugged the dock into two Dell P2715Q 4ks and they work! I preferred using the direct cables rather than any adapters, but I also tested a USB-C to HDMI 2.0 adapter I got in 2018 with some other Dell monitors in the house and that worked with the Surface Book 3 as it had previously with the Book 2.
SURPRISE NOTE: How does the super-thin Surface Pro X do when plugged into a Surface Dock 2? Amazing. It runs two 4k monitors at 60 Hz. I don't know why I was shocked, it's listed on the support page. It's a brand new device, but it's also the size and weight of an iPad so I was surprised. It's a pretty amazing little device - I'll do another post on just the ARM-based Surface Pro X another time.
One final thing about the new Dock. The cable is longer! The first dock had a cable that was about 6" too short and now it's not. It's the little things and in this case, a big thing that makes a Dock that much nicer to use.
Conclusion
All in all, I'm very happy with this Surface Book 3 having been an existing Surface Book 2 user. It's basically 40-50% faster, the video card is surprisingly capable. The SSD is way faster at the top end. It's a clear upgrade over what I had before, and when paired with the Surface Dock 2 and two 4k monitors it's a capable developer box for road warriors or home office warriors like myself.
Sponsor: Have you tried developing in Rider yet? This fast and feature-rich cross-platform IDE improves your code for .NET, ASP.NET, .NET Core, Xamarin, and Unity applications on Windows, Mac, and Linux.
© 2020 Scott Hanselman. All rights reserved.
Tumblr media Tumblr media Tumblr media Tumblr media Tumblr media
      Review of the Surface Book 3 for Developers published first on https://deskbysnafu.tumblr.com/
0 notes