
Computers?


Fraggiebaby

Recommended Posts

48 minutes ago, Geano said:

I agree completely on the delay to not step on the toes of the consoles, seeing as AMD has had these console-based chips in the works for some time now, and releasing a full-on video card for PC would more or less cannibalize that market, as even a slightly technically inclined user would realize what they were buying. I also think the idea of an onboard SSD for these video cards would be pretty awesome, though it would more or less be a stopgap type of situation. The sizes for these SSDs would likely only be large enough to hold one, maybe two games, and for the AAA titles which readily exceed 100GB, forget putting more than one of those on there, lol. I do think, though, that as flash storage becomes cheaper this will be a much more viable option.

What's "expensive" or "cheap" depends on who you talk to.  People (though not me!) are already spending over $700 - even $1,200! - on a gfx card.  NVMe storage is currently priced around 10 cents per GB.  That means 1TB of flash would only be maybe an extra $100.  My current desktop only has a 500GB SSD and I have several games on it.  This includes several different versions Kerbal Space Program, each with its own set of installed mods.  I don't know how accessible the storage would be on one of the (rumored) cards, but if I could also use it to store regular files then I could maybe do without my M.2 drive on the motherboard and the extra cost for the GPU would be a wash.  At that point it would be down to things like upgradability and overall system performance. 

If someone were to offer me a GFX card with performance of a 2080 and 1TB of ultra-fast SSD storage for $500 then I might be tempted, especially if the flash on the card were socketed and upgradable.  Maybe 2xM.2 slots say.  You can already get quad-M.2 PCI cards for less than $100.  So, take a regular GFX card, rework it a bit to add the flash controller and manage things like heat and power, and raise the price to match the extra performance.  Existing GFX cards already have things like heat sinks and memory controllers anyway.  It wouldn't even have to cost much beyond the price of the actual flash memory chips, and like I said flash memory is getting pretty cheap these days.
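Just to put rough numbers on that, here's the back-of-the-envelope math as a quick Python sketch. The 10 cents/GB figure is the ballpark street price assumed above, so treat the outputs as estimates, not quotes.

# Rough cost of the raw NVMe flash for an on-card SSD, at the ~10 cents/GB
# ballpark assumed above (2020-era street price, not a quote).
PRICE_PER_GB_USD = 0.10

def flash_cost(capacity_gb: int) -> float:
    """Cost of capacity_gb gigabytes of flash at the assumed price per GB."""
    return capacity_gb * PRICE_PER_GB_USD

for capacity_gb in (500, 1000, 2000):
    print(f"{capacity_gb:>5} GB of flash ~= ${flash_cost(capacity_gb):,.0f}")

# Prints roughly:
#   500 GB of flash ~= $50
#  1000 GB of flash ~= $100
#  2000 GB of flash ~= $200

That's just the chips, of course; the controller, routing, and cooling mentioned above would add to it, but it shows why the extra cost could plausibly be "a wash" against a motherboard M.2 drive.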


10 minutes ago, efaardvark said:

What's "expensive" or "cheap" depends on who you talk to.  People (though not me!) are already spending over $700 - even $1,200! - on a gfx card.  NVMe storage is currently priced around 10 cents per GB.  That means 1TB of flash would only be maybe an extra $100.  My current desktop only has a 500GB SSD and I have several games on it.  This includes several different versions Kerbal Space Program, each with its own set of installed mods.  I don't know how accessible the storage would be on one of the (rumored) cards, but if I could also use it to store regular files then I could maybe do without my M.2 drive on the motherboard and the extra cost for the GPU would be a wash.  At that point it would be down to things like upgradability and overall system performance. 

If someone were to offer me a GFX card with performance of a 2080 and 1TB of ultra-fast SSD storage for $500 then I might be tempted, especially if the flash on the card were socketed and upgradable.  Maybe 2xM.2 slots say.  You can already get quad-M.2 PCI cards for less than $100.  So, take a regular GFX card, rework it a bit to add the flash controller and manage things like heat and power, and raise the price to match the extra performance.  Existing GFX cards already have things like heat sinks and memory controllers anyway.  It wouldn't even have to cost much beyond the price of the actual flash memory chips, and like I said flash memory is getting pretty cheap these days.

See, that's the thing: serviceability for the average user. Now when I say average user in this particular case I am speaking of the average system builder. The prebuilt computer industry, even with PCs, has started to move towards a more proprietary scene, even with gaming PCs with mobos and other parts that are made to suit just that line rather than being general purpose. While this has always been true to a point, it's becoming more widespread outside the custom builder market. So I would say as consumers we should err on the side of caution with too much interconnected, proprietary design, as really we vote with our $, and that's what the companies will invest in. With that said, I do think having a small amount of flash storage, even for a portion of a game to make it load faster, would be hella dope.


Funny how Intel is saying benchmarks don't matter now that they're losing so badly to AMD.  :D

Seriously though... I hope Intel gets their act together soon.  Otherwise in a couple/few years we'll have AMD forcing us to pay insane prices for mediocre, incremental performance increases.  Don't get me wrong, I like AMD's current lineup and pricing.  My current system is all-in AMD, with a x470 motherboard, a 2700X CPU, and an "old" Radeon RX480 GPU.  I'll probably be replacing the RX480 when the RDNA2 boards come out, and I'm planning on upgrading the motherboard and CPU in a year or so to Ryzen 9 once DDR5 is out and reasonably priced.  (Or maybe even threadripper, depending on budget.)  I just have no illusions about how publicly-traded companies behave.  If AMD switches with Intel for the top spot and it takes Intel a few years to catch up then I have no doubts that we'll all eventually be cursing AMD's prices and lack of innovation as loudly as we were Intel's before Ryzen and Epyc came along.  Healthy competition is clearly a Good Thing for us buyers.


I'm relatively new to PC gaming (with the exception of Sims and Planet Zoo). I bought a prebuilt computer that comes in this Wednesday (a friend of mine who builds PCs helped me pick it out). It's one that I would be able to gut out and add new parts in if I wanted to in the future. 

  • Like 1

I’m about as tech-savvy as your average American, but I just wanted to make a contribution to this popular-seeming thread. Computers? I use a Windows 7. Seriously, if you stalked me, on the off chance I wasn’t watching anime or reading manga, I would be playing Purble Place. I’ve had people tell me it’s “gross”, or that I’m some kind of premature boomer (I’m not allowed to drive a car yet, buddy... [although admittedly I do love “outdated” stuff.]) Anyway, in my opinion new computers are overrated because why would you want all those special features when you can have an internet connection, fun games, and pre-set musical wonders? 

.....aaaaand I just realized people on here are using weird acronyms. I feel like Kaminari being brain-dead.


On 6/11/2020 at 2:49 PM, OtakuKid said:

...in my opinion new computers are overrated because why would you want all those special features when you can have internet connection, fun games, and pre-set musical wonders

I can agree with this to a certain extent.  Certainly for all the "basic" stuff like email, word-processing, web surfing, streaming entertainment etc. ... you're pretty much fine on any computer running any operating system these days.  Even for most gaming you're covered.  There are still issues such as certain software not running on certain operating systems, but even there the functionality exists using different software.  Get yourself a basic computer and a good word processor, spreadsheet, web browser etc. and you're good to go.  There are fun games on every platform.  Even an $80 Raspberry Pi running linux and LibreOffice and connected to a $50 HD monitor can make a useful desktop system, including for streaming things like YouTube, Crunchyroll, and Spotify over the web.  (Or at least the problems there are not related to the performance of either the hardware or the software, and have not been for many, many years.)

DMCA nonsense aside, the place where we always run into trouble is the cutting-edge stuff that "needs" the performance.  Let me back up a bit.  I've been around for a while.  (Speaking of boomers.)  When I was in middle school they didn't have home computers or the internet.  I took "drafting" (hand-drawing blueprints and such) and "typing" (on a mechanical typewriter no less!) as my elective, career-related subjects because computers and computing didn't exist in the classroom.  I never saw any computer more powerful than a hand calculator in any of my classrooms until my senior year of high school.  Even in college programming classes our labwork and tests were done on text-only terminals connected to the school's mainframe.  I've been around since the beginning of home computers.  Our family's first computer was one of the original Apple ][ computers.

The reason we bought it?  To play games.  :) 

I speak from personal experience when I say that games have always been a big driver of personal computer technology.  We didn't realize it at the time, but that's how it worked out.  I first learned BASIC programming on the Apple ][, but I spent most of my actual time on it playing games.  Later I learned C/C++ on a Commodore Amiga... that I bought as much because of the cool games that were available for it at the time as because I could do my college programming classwork on it without driving in to the lab to do it.  The purchase of every upgraded part or new system was always for better performance, usually because I was disappointed in the performance of the latest game on my existing computer.  Sure, we've used our computers for other things along the way, including for serious applications like work and education that pay the bills.  Today computers are the primary tool for most forms of non-game entertainment too, especially since the Internet took off.  But games have always been the software that most pushed against the upper bounds of performance on all my systems, and every new hardware purchase was made partly with an eye toward what games it would enable.  Basically, better computers => better games.

(And better games => better computers.  It wasn't just me and my family.  Millions of people spending money on upgrades and new computers over the decades has created and funded the huge software and hardware gaming industries.  These days, at least for the home market, hardware development and marketing is aimed mainly at gamers and gaming.  Certainly GPUs are currently marketed mainly for gamers.  You don't need "RTX" ray-tracing or any of the other marketing buzzwords for word processing, spreadsheets, or email.  Even hardware encoding & decoding for streaming is pretty much a given these days on any GPU.  CPUs too are usually benchmarked by how many FPS they get in popular games.  The whole game-centric console industry alone is worth billions of USD per year.  Here in the US we spend far more on games and gaming than we give to NASA every year!  That much money going into development produces huge advances in capabilities over time.  If you're a cutting-edge gamer then you're always looking for the next upgrade.  If you're interested in pretty much any other aspect of computing then the amazing performance of the hardware you're using now is due in no small part to games and gamers putting pressure on the PC industry to develop that performance.)

All that said, I also readily agree that there is a cost associated with staying on a perpetual upgrade treadmill.  Some people go way overboard, spending far more - and more often - than is rational for sometimes questionable gains.  Don't believe everything that the murketroids from AMD, Intel, nvidia, Sony, EA, Blizzard, etc. tell you.  (I'm sorry, overspending on a 2070 Super or a GTX 1080 is not worth it just so I can be "competitive" in CoD: Warzone!)  Still, if you back off from the bleeding edge even just a little bit then every once in a while an upgrade is reasonably priced enough to be worth it.  :)

 

  • Like 1

4 hours ago, efaardvark said:

I'd been thinking of trying the Brave web browser but this reddit thread takes it off the table for me.  Sticking with firefox (+ adblock and umatrix ) for now.

 

This was in the news a lot recently, yes. Though they have stopped doing so since the story spread like wildfire. I know you're an open source guy; I like open source software too, though I am OK with closed source things like Windows. In terms of Brave, they switched the default to no longer do this, or so they say.

Just a good reminder for us all to check all our configurations and settings after installing any software. Even a lot of open source software has diagnostic data collection toggled on by default. Though I do see the argument that it's a bit shady. 


30 minutes ago, Geano said:

This was in the news a lot recently, yes. Though they have stopped doing so since the story spread like wildfire. I know you're an open source guy; I like open source software too, though I am OK with closed source things like Windows. In terms of Brave, they switched the default to no longer do this, or so they say.

Just a good reminder for us all to check all our configurations and settings after installing any software. Even a lot of open source software has diagnostic data collection toggled on by default. Though I do see the argument that it's a bit shady. 

Unfortunately it reveals a pattern of behavior that puts me off.  Yeah they readily "fixed" it this time.  But they've done similar stuff in the past, they didn't seem to be all that upset about doing it again this time, and they didn't promise not to do it again in the future.

Not that open-source is free from this sort of thing either.  Canonical, the maintainer of the Ubuntu Linux distribution, also got called out for things like putting an Amazon icon on the desktop and sending search terms for "desktop" searches (i.e. supposedly local searches of your own files) to Amazon.  Nothing built by humans is immune, though some environments make it easier to audit and monitor your tools than others.  Consider the Windows-10 situation for example.  Also, in the case of the Ubuntu/Amazon thing above there was already a prefs setting to turn it off or on.  One could argue that the default should not have been to have it turned on, and there were a number of technical issues with the implementation as well, but it wasn't exactly hidden either, and it didn't require an update to "fix".

  • Agree 1

2 hours ago, efaardvark said:

Unfortunately it reveals a pattern of behavior that puts me off.  Yeah they readily "fixed" it this time.  But they've done similar stuff in the past, they didn't seem to be all that upset about doing it again this time, and they didn't promise not to do it again in the future.

Not that open-source is free from this sort of thing either.  Canonical, the maintainer of the Ubuntu Linux distribution, also got called out for things like putting an Amazon icon on the desktop and sending search terms for "desktop" searches (i.e. supposedly local searches of your own files) to Amazon.  Nothing built by humans is immune, though some environments make it easier to audit and monitor your tools than others.  Consider the Windows-10 situation for example.  Also, in the case of the Ubuntu/Amazon thing above there was already a prefs setting to turn it off or on.  One could argue that the default should not have been to have it turned on, and there were a number of technical issues with the implementation as well, but it wasn't exactly hidden either, and it didn't require an update to "fix".

Wow, that sucks, though I guess I am not surprised that Ubuntu would do this. Over the years they seem to have gotten more and more influenced by this sort of thing. I presume you're referring to the Windows phone-home feature they use in every version except Enterprise and Edu. You can run a few exes to remove it in other versions, though it's super techy and not something average users would even think about. It's sad, but then Windows does allow you to modify a lot for a so-called closed source solution, so they figure those who want to take the time to mod the registry will, but most won't, so there will be a steady stream of diagnostic data for them to improve Windows. In my opinion, if they stopped there I'd be OK with that, as most devs do use this to improve their software. Though they push for much more collection, and even more if you sign in via your Microsoft account. Of course all these things can be opted out of or minimized, though they don't make it easy for the average non-tech user.
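For illustration, the registry side of that looks roughly like the sketch below. It uses Python's standard winreg module; the DataCollection\AllowTelemetry policy value is the commonly cited knob (and 0 is only honored on Enterprise/Education editions), so treat the exact key and levels as an assumption to verify, back up your registry, and run it from an elevated prompt.

# Hypothetical sketch of the telemetry registry tweak discussed above.
# The exact policy key/levels are an assumption to double-check first.
import winreg

KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows\DataCollection"

def set_telemetry_level(level: int) -> None:
    """Write the AllowTelemetry policy value (0=Security ... 3=Full)."""
    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                            winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "AllowTelemetry", 0, winreg.REG_DWORD, level)

if __name__ == "__main__":
    set_telemetry_level(1)  # 1 = "Basic" diagnostic data; needs admin rights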

When it comes to Brave, thanks for the heads up on that. I am still fairly new to it, but I'll keep my eye on news related to it for things like this. Has Firefox had things like this happen that you know of recently?


1 hour ago, Geano said:

Wow, that sucks, though I guess I am not surprised that Ubuntu would do this.

It is a money thing.  Most linux distributions make their money with support contracts for corporate users.  As long as they abide by the GNU license rules I'm fine with that because it employs people to make a living working on linux software, which ultimately the entire community benefits from.  Canonical seems more open to .. experimentation in regards to income opportunities.  That dash thing was back in 2012 though and they removed the "feature" entirely in 2016.  Since then they've been behaving themselves, at least in that respect.

Edited by efaardvark
  • Winner 1

On 6/14/2020 at 12:41 AM, efaardvark said:

It is a money thing.  Most linux distributions make their money with support contracts for corporate users.  As long as they abide by the GNU license rules I'm fine with that because it employs people to make a living working on linux software, which ultimately the entire community benefits from.  Canonical seems more open to .. experimentation in regards to income opportunities.  That dash thing was back in 2012 though and they removed the "feature" entirely in 2016.  Since then they've been behaving themselves, at least in that respect.

Right, I would not disagree; after all, gotta keep the funding for these open source projects flowing to keep them free and powerful. 

So I am wondering about something. When did you switch to Linux, and what keeps you using it besides the fact it is open source?


On 6/13/2020 at 9:09 PM, efaardvark said:

I'd been thinking of trying the Brave web browser but this reddit thread takes it off the table for me.  Sticking with firefox (+ adblock and umatrix ) for now.

 

Funny, I tried out Brave 2 days before the thread was posted and really liked it, but then yeah, same: back to Firefox + Privacy Badger, uBlock and "ddg privacy essentials"!

 

 

On 6/11/2020 at 11:49 PM, OtakuKid said:

I’m about as tech-savvy as your average American, but I just wanted to make a contribution to this popular-seeming thread. Computers? I use a Windows 7. Seriously, if you stalked me, on the off chance I wasn’t watching anime or reading manga, I would be playing Purble Place. I’ve had people tell me it’s “gross”, or that I’m some kind of premature boomer (I’m not allowed to drive a car yet, buddy... [although admittedly I do love “outdated” stuff.]) Anyway, in my opinion new computers are overrated because why would you want all those special features when you can have an internet connection, fun games, and pre-set musical wonders? 

.....aaaaand I just realized people on here are using weird acronyms. I feel like Kaminari being brain-dead.

I had to use Windows 7 at my last holiday internship, and really liked it. Faster and more stable than that sh**** Windows 10. I can see why a huge tech company (and I mean really huge, so they probably know their stuff) would prefer it over Windows 10. It works and does the job.

 

On 6/14/2020 at 1:39 AM, efaardvark said:

Apple ][

We have one in our cellar, proudly displayed in my father's "man cave" of sorts (well, it's more like everything that has to do with tech and modelmaking crammed into a really tiny space). I've never used it, but it supposedly still works... For some reason I really like the aesthetics of old tech; the bulky, beige frame just has its own appeal. And those tactile mechanical keyboards are just really, really awesome. (Too bad I'm a minimalist when it comes to tech [in terms of looks] for actual every-day usage.) Oh, and the design of the old ThinkPads when IBM still made them.

 

2 hours ago, Geano said:

Right, I would not disagree; after all, gotta keep the funding for these open source projects flowing to keep them free and powerful. 

One problem is that that stuff was in plain sight on GitHub for a long time, afaik. Makes you wonder how much privacy malpractice is "hidden" in plain sight in open source and no one simply found it or bothered looking through the code.

 

As for Ubuntu, while the controversy is kinda old by now, what I don't like is that Ubuntu goes more and more in the direction of forcing the snap store down people's throats. That's a deal breaker for me. (Also, yum and pacman are way better than apt!) If you really want to know what happens on your system and want to say exactly what's on your system from the very start, you gotta go with Arch Linux, but then again, most people (including me) usually don't want to go through some endlessly tedious installation just to have a working system to browse the internet.

  • Cool (Kakkoii) 1

11 hours ago, leinwandname said:

One problem is that that stuff was in plain sight on GitHub for a long time, afaik. Makes you wonder how much privacy malpractice is "hidden" in plain sight in open source and no one simply found it or bothered looking through the code.

 

As for Ubuntu, while the controversy is kinda old by now, what I don't like is that Ubuntu goes more and more in the direction of forcing the snap store down people's throats. That's a deal breaker for me. (Also, yum and pacman are way better than apt!) If you really want to know what happens on your system and want to say exactly what's on your system from the very start, you gotta go with Arch Linux, but then again, most people (including me) usually don't want to go through some endlessly tedious installation just to have a working system to browse the internet.

Yeah, I think in the case of Brave it's more the fact that they tout themselves as the so-called premier privacy solution compared to their competition like Chrome or Edge.

Yeah, I hear you there. I am a fan of the old way Ubuntu worked, as even in Ubuntu I mostly just rely on the command line for most things. I would also agree that Arch is definitely not for everyone, though it is fun to play with and learn about.


15 hours ago, leinwandname said:

One problem is that that stuff was in plain sight on GitHub for a long time, afaik. Makes you wonder how much privacy malpractice is "hidden" in plain sight in open source and no one simply found it or bothered looking through the code.

At least that is an option when you have the source code available.  Also, if you otherwise like the code base then you can always create your own fork without the malformed bits.

15 hours ago, leinwandname said:

As for Ubuntu, while the controversy is kinda old by now, what I don't like is that Ubuntu goes more and more in the direction of forcing the snap store down people's throats. That's a deal breaker for me. (Also, yum and pacman are way better than apt!) If you really want to know what happens on your system and want to say exactly what's on your system from the very start, you gotta go with Arch Linux, but then again, most people (including me) usually don't want to go through some endlessly tedious installation just to have a working system to browse the internet.

I definitely hear this.  See my other post about troubles getting optifine installed in minecraft because Canonical repackaged their minecraft into a snap that put the .minecraft directory in an obscure place.  And disconnected my own, working, minecraft install in the process.  Why, Canonical?  Why? 

As you say however, there's plenty of other linux distros out there.  If you don't like one, pick another.  I'm not sure what the point of Ubuntu is anymore anyway.  Corporate types will go with SuSE or RedHat or CentOS.  Even Oracle has a (redhat-based) linux distribution they'll sell you a support contract for these days.  Home users can go with something like MX if they want some hand-holding, or Manjaro if they're a bit more adventurous.  If they're purists they can go for Debian.  (If anyone has ever read Zelazny's "Chronicles of Amber" series, I've always thought of Debian as the linux universe's Pattern distro.)  There's a wide spectrum of distros for all sorts of users.  Maybe too many in some sense, but I'll take too many choices over too few any day.  It is good to have choices.


18 hours ago, Geano said:

When did you switch to Linux, and what keeps you using it besides the fact it is open source?

I was there pretty much from the beginning.  I skipped the 386 versions because they were single-user and not generally useful as an OS unless you happened to be someone studying OSs.  (Like Linus was.)  But as soon as the 486 came along I was playing with it.

Long story version... Once Upon a Time, way back in the twilight of my Commodore Amiga days :D I had an Amiga 3000 that was looking like it was going to be orphaned.  The A3000 was an awesome computer that had a 68030 CPU and (among other superpowers for computers at the time) a memory management unit.  The MMU allowed for protected-memory operating systems like UNIX to run on it.  Commodore even had a corporate bundle that included System V unix, though they sold their systems with their own AmigaOS to home users.

(This was in that moment of time when a lot of people thought unix was going to be the one true OS, even for the home.  Sun was ascendant.  Steve Jobs had left Apple and built his unix-based NeXT system.  SGI was making unix-based (Irix) GFX workhorses for the emerging CGI/entertainment media industry.  Apollo was making another unix-based graphical workstation.  HP was selling unix workstations.  Even Microsoft had their Xenix systems.)

Well, the Chickenlips company did fold, but my computer still worked.  My migration options at the time were Windows 2.1 (on an 80386, a serious step down hardware-wise from my 68030), IBM's OS/2, that weird System 6/7/8/9 thing that post-Jobs Apple was pushing before MacOS X came along, or something else.  I went with NetBSD, mostly because I was using Sys V unix on the Suns at work and for BSD's "ports" system of source code, and bootstrapped the 68030 code onto my "old" A3k system.  Eventually that system became too slow (relatively speaking), but by then Intel's first "real" CPU, the 80486, was out and was cheap enough to put in home systems, so I built myself a system with one and for grins tried this unix-like "linux" thing that everyone was talking about at the time.  It worked slightly differently than the unix I was used to at work and my old BSD system but it was basically the same if you squinted a bit, and it was a lot easier & quicker to install since it came on a bootable CD and I didn't have to download all the source through my 14k modem. :D Yggdrasil was the first linux distro I ever installed.  I've had at least one of my systems running some version of linux ever since.

 

I think the main reason I keep using open source is the shoulders-of-giants effect.  With open source you get the source.  You can do whatever you want, including examining and modifying things down as deep as you want to go... all the way to bare metal if you want to (or need to).  But you don't usually need to because there's generally a very good technical reason that things are the way they are.  (And it doesn't involve farming the users for cash.)  It is (usually) a meritocracy.  If anyone thinks they can do better then (they're probably wrong but) they can fork the code and go their own way.  If other people decide it actually is better then that becomes the way forward.  If not, then it doesn't.  Nobody gets forced down a path they don't want to take.  With a MacOS or Windows you get a kind of take it or leave it attitude.  You buy stuff that other people tell you you need.  You get rigid systems that (if you're lucky) do one thing well, but there's very few options for extending or modifying - or even studying or auditing - the code they deliver.  And there's a forced upgrade every time the companies' coffers get a bit low.

18 hours ago, leinwandname said:

We have one in our cellar, proudly displayed in my father's "man cave"

I’ve still got ours too, though I keep it mostly for the nostalgia factor.  I haven’t even turned it on in years.  Used to use it as test bed for digital electronics experiments but these days I mostly go for Microcontroller based options like the AVR chips in Arduinos.  More memory, faster, etc.



On 6/15/2020 at 11:08 PM, efaardvark said:

I was there pretty much from the beginning.  I skipped the 386 versions because they were single-user and not generally useful as an OS unless you happened to be someone studying OSs.  (Like Linus was.)  But as soon as the 486 came along I was playing with it.

Long story version... Once Upon a Time, way back in the twilight of my Commodore Amiga days :D I had an Amiga 3000 that was looking like it was going to be orphaned.  The A3000 was an awesome computer that had a 68030 CPU and (among other superpowers for computers at the time) a memory management unit.  The MMU allowed for protected-memory operating systems like UNIX to run on it.  Commodore even had a corporate bundle that included System V unix, though they sold their systems with their own AmigaOS to home users.

(This was in that moment of time when a lot of people thought unix was going to be the one true OS, even for the home.  Sun was ascendant.  Steve Jobs had left Apple and built his unix-based NeXT system.  SGI was making unix-based (Irix) GFX workhorses for the emerging CGI/entertainment media industry.  Apollo was making another unix-based graphical workstation.  HP was selling unix workstations.  Even Microsoft had their Xenix systems.)

Well, the Chickenlips company did fold, but my computer still worked.  My migration options at the time were Windows 2.1 (on an 80386, a serious step down hardware-wise from my 68030), IBM's OS/2, that weird System 6/7/8/9 thing that post-Jobs Apple was pushing before MacOS X came along, or something else.  I went with NetBSD, mostly because I was using Sys V unix on the Suns at work and for BSD's "ports" system of source code, and bootstrapped the 68030 code onto my "old" A3k system.  Eventually that system became too slow (relatively speaking), but by then Intel's first "real" CPU, the 80486, was out and was cheap enough to put in home systems, so I built myself a system with one and for grins tried this unix-like "linux" thing that everyone was talking about at the time.  It worked slightly differently than the unix I was used to at work and my old BSD system but it was basically the same if you squinted a bit, and it was a lot easier & quicker to install since it came on a bootable CD and I didn't have to download all the source through my 14k modem. :D Yggdrasil was the first linux distro I ever installed.  I've had at least one of my systems running some version of linux ever since.

 

I think the main reason I keep using open source is the shoulders-of-giants effect.  With open source you get the source.  You can do whatever you want, including examining and modifying things down as deep as you want to go... all the way to bare metal if you want to (or need to).  But you don't usually need to because there's generally a very good technical reason that things are the way they are.  (And it doesn't involve farming the users for cash.)  It is (usually) a meritocracy.  If anyone thinks they can do better then (they're probably wrong but) they can fork the code and go their own way.  If other people decide it actually is better then that becomes the way forward.  If not, then it doesn't.  Nobody gets forced down a path they don't want to take.  With a MacOS or Windows you get a kind of take it or leave it attitude.  You buy stuff that other people tell you you need.  You get rigid systems that (if you're lucky) do one thing well, but there's very few options for extending or modifying - or even studying or auditing - the code they deliver.  And there's a forced upgrade every time the companies' coffers get a bit low.

I see, very well said and rather convincing. The whole open source vs closed source argument can be made for some things, though I guess for the most part Windows has never bothered me. I have a strong Windows admin background on both their desktop and server OSs. I do think the open source solutions for certain apps are a clear choice for sure, though it just depends on what you really want to do. It's a bit of a trade-off: with closed source, the only eyes on the code are whoever Microsoft permits for the deepest things, though in the interest of their customers, especially their business customers, I'd think they are pretty secure with their product, and nothing is 100% secure no matter what it is. The trade-off with open source is you're counting on the fact that there are enough eyes acting in good faith on the code to prevent a huge security risk. Though the thing with coding is you don't often need to be an expert or software engineer to spot an opportunity, and with open source it's there in the open for everyone to study. Though I guess the other side of that coin is that it gets fast development, and even if it is sunsetted at some point there may still be small-time modifications, depending on the licenses of course.

Edited by Geano

On 6/7/2020 at 6:44 PM, efaardvark said:

Funny how Intel is saying benchmarks don't matter now that they're losing so badly to AMD.  :D

Seriously though... I hope Intel gets their act together soon.  Otherwise in a couple/few years we'll have AMD forcing us to pay insane prices for mediocre, incremental performance increases.  Don't get me wrong, I like AMD's current lineup and pricing.  My current system is all-in AMD, with a x470 motherboard, a 2700X CPU, and an "old" Radeon RX480 GPU.  I'll probably be replacing the RX480 when the RDNA2 boards come out, and I'm planning on upgrading the motherboard and CPU in a year or so to Ryzen 9 once DDR5 is out and reasonably priced.  (Or maybe even threadripper, depending on budget.)  I just have no illusions about how publicly-traded companies behave.  If AMD switches with Intel for the top spot and it takes Intel a few years to catch up then I have no doubts that we'll all eventually be cursing AMD's prices and lack of innovation as loudly as we were Intel's before Ryzen and Epyc came along.  Healthy competition is clearly a Good Thing for us buyers.

Intel is in trouble since they lost their cash cow, which is Apple. Can they come back on top? Yes, sure: five years ago AMD was considered essentially out of the game. Will they? Well, if they aggressively pursue R&D like AMD did...

Don’t underestimate the ability of ARM to become a viable desktop platform for Windows and Linux in the near future: all it would take is someone designing a modular, user-replaceable ARM combo instead of the usual SoCs, and voilà. Windows can already run on ARM. RISC might be the future (which would fulfill its goal, since it’s a more modern/efficient paradigm).


44 minutes ago, mynameisjohncrichton said:

Intel is in trouble since they lost their cash cow, which is Apple. Can they come back on top? Yes, sure: five years ago AMD was considered essentially out of the game. Will they? Well, if they aggressively pursue R&D like AMD did...

Don’t underestimate the ability of ARM to become a viable desktop platform for Windows and Linux in the near future: all it would take is someone designing a modular, user-replaceable ARM combo instead of the usual SoCs, and voilà. Windows can already run on ARM. RISC might be the future (which would fulfill its goal, since it’s a more modern/efficient paradigm).

I would not say it's that cut and dried, with ARM CPUs sweeping the market. It is true that for Mac, and certain distros of Linux, this may very well be a good option for reduced power consumption among other things. Though for mainstream computing the x86 ISA has a long road ahead of it. Apple can get away with using ARM, which is a RISC architecture, because they are a closed ecosystem. Linux, and especially Windows, is a very open ecosystem. Windows desktop OSs, and even their server OSs, can technically run on basically any hardware, including x86-based Mac hardware. I see this as more of a typical Apple muscle-flex move to further enclose their ecosystem. 

With that said, it is true that a big reason Apple jumped on the RISC bandwagon, in particular with ARM, is the steady stream of exploits with Intel, and even AMD, CPUs. 

Technically Windows already supports ARM CPUs in the form of some tablet computers, though I do not see them going to it exclusively. 


Whether it is ARM or something else I think x86 as an ISA is going away at some point.  The licensing overhead is just too much to sustain.  IMHO something RISC-y is a better solution technically anyway, at least as long as general-purpose CPUs rely on electrons.  Heat limits mean that clocks can't be pushed any higher than around 5Ghz, so to get more compute power per "chip" (die) we need more cores.  CISC instruction length is a handicap in that situation.  It is also easier to design RISC CPUs hardware-wise.  ARM is in a good position to take over because it is already well-known on the server, mobile, and embedded side(s) of things.  Linux, BSD, and even Windows can already run on ARM processors.  Even ARM has licensing fees associated with it however.  Not nearly as much as x86, but enough that people already over-sensitized to such things from having to deal with x86 might over-react and consider starting from scratch to be a desirable situation.  That might give an opening for something like RISC-V.  There's already a debian linux port for RISC-V, and development/experimental hardware to run it.  We'll see.

  • Like 1

3 hours ago, efaardvark said:

Whether it is ARM or something else I think x86 as an ISA is going away at some point.  The licensing overhead is just too much to sustain.  IMHO something RISC-y is a better solution technically anyway, at least as long as general-purpose CPUs rely on electrons.  Heat limits mean that clocks can't be pushed any higher than around 5Ghz, so to get more compute power per "chip" (die) we need more cores.  CISC instruction length is a handicap in that situation.  It is also easier to design RISC CPUs hardware-wise.  ARM is in a good position to take over because it is already well-known on the server, mobile, and embedded side(s) of things.  Linux, BSD, and even Windows can already run on ARM processors.  Even ARM has licensing fees associated with it however.  Not nearly as much as x86, but enough that people already over-sensitized to such things from having to deal with x86 might over-react and consider starting from scratch to be a desirable situation.  That might give an opening for something like RISC-V.  There's already a debian linux port for RISC-V, and development/experimental hardware to run it.  We'll see.

Even so, it will certainly be a longer transition than a few years, and RISC CPUs compared side by side with CISC are technically not as capable, at least for the time being. But in either case we'll just have to go where the industry leads us. 


1 hour ago, Geano said:

Even so, it will certainly be a longer transition than a few years, and RISC CPUs compared side by side with CISC are technically not as capable, at least for the time being. But in either case we'll just have to go where the industry leads us. 

Really the only thing that CISC has going for it at the moment is the huge investment that the industry has made in the x86 ISA, mainly Windows.  In a lot of ways Intel + Microsoft have held the industry back in terms of effectively using newer technologies like RISC.  If you don't need any of that then RISC can take you into hardware performance territory that you simply can't get to with CISC architectures.  For mobile for example you really have to use something other than CISC for power reasons.  That's why virtually all phones and tablets use something like an ARM processor and not something from Intel or AMD.  Apple reached the limits of x86 a long time ago; that's why it is going with its own ARM-based CPUs even in its desktops and laptops starting as soon as this year or next.  (It was supposed to be this year but got delayed due to the virus.)  All of Apple's phones, watches, pads, etc. already use ARM-based processors.  Even MSFT tried to go ARM with their mobile/surface platform, though they kind of flubbed it.  Of course Unix-based OSs were designed to be ported.  Linux for example can run almost anywhere.  Pretty much the ONLY place still requiring CISC is Windows on x86 desktops/laptops.

Edited by efaardvark
  • Winner 1

20 hours ago, efaardvark said:

Really the only thing that CISC has going for it at the moment is the huge investment that the industry has made in the x86 ISA, mainly Windows.  In a lot of ways Intel + Microsoft have held the industry back in terms of effectively using newer technologies like RISC.  If you don't need any of that then RISC can take you into hardware performance territory that you simply can't get to with CISC architectures.  For mobile for example you really have to use something other than CISC for power reasons.  That's why virtually all phones and tablets use something like an ARM processor and not something from Intel or AMD.  Apple reached the limits of x86 a long time ago; that's why it is going with its own ARM-based CPUs even in its desktops and laptops starting as soon as this year or next.  (It was supposed to be this year but got delayed due to the virus.)  All of Apple's phones, watches, pads, etc. already use ARM-based processors.  Even MSFT tried to go ARM with their mobile/surface platform, though they kind of flubbed it.  Of course Unix-based OSs were designed to be ported.  Linux for example can run almost anywhere.  Pretty much the ONLY place still requiring CISC is Windows on x86 desktops/laptops.

Yes, I did do some reading between replies, so to speak, and I do see your point here. At the end of the day, though, the industry really will go with what is best. Who's to say a successor to x86 with a more complex ISA won't come along soon? 

Say, speaking along the same lines, I want to know your thoughts: why have we not gone to 128-bit computing, like when we jumped from 32 to 64 bit way back when?


59 minutes ago, Geano said:

Say, speaking along the same lines, I want to know your thoughts: why have we not gone to 128-bit computing, like when we jumped from 32 to 64 bit way back when?

We have not gone to 128 bits because there is a cost for doing so and we don't really need it.  There's a whole bunch of reasons for the cost.  At the hardware level we use parallel buses.  That means that every bit gets its own data line.  If two wires touch then the computer stops working.  If two wires even just get too close then there's crosstalk and your computer becomes unreliable.  Routing all those wires across the motherboard to connect the CPU, memory, PCI buses, etc. becomes a problem the wider the bus is.  An 8-bit bus is easy to design.  16 bits is also pretty easy.  32 bits starts to get troublesome.  64 bits is downright tricky.  This is also a problem for chip design internally, for similar reasons.

Ok, so why even go to 64 bits then?  Well, with 32 bits you can only count to 4 billion.  If you have a file that you want to reference a particular byte of data in then your file can only be 4 gigabytes in size.  If you have memory addresses that you want to reference then you can only have 4GBytes of memory and still be able to reference each byte individually.  Lots of people want to use files more than 4GB in size or have more than 4GB of memory in their computers.  Yes, there are tricks like using two 32-bit registers to hold a single 64-bit number, but now things like your math libraries and other code at the software level get complicated and slow.  Worst case they have to do twice as much work and run half as fast.  It is worth it to go to 64 bit buses and let the hardware do most of the work, even though it makes the hardware a bit harder to design and more expensive to build.
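As a toy illustration of that register-pair trick (not how any particular compiler actually does it), here's the extra bookkeeping sketched in Python, pretending we only have 32-bit-wide pieces to work with:

# Emulating a 64-bit add with two 32-bit halves: the carry out of the low
# half has to be folded into the high half by hand - that's the extra work.
MASK32 = 0xFFFFFFFF

def split64(x: int) -> tuple:
    """Split a 64-bit value into (high 32 bits, low 32 bits)."""
    return (x >> 32) & MASK32, x & MASK32

def add64(a: int, b: int) -> int:
    """Add two 64-bit values using only 32-bit-wide operations."""
    a_hi, a_lo = split64(a)
    b_hi, b_lo = split64(b)
    lo = a_lo + b_lo
    carry = lo >> 32                     # did the low half overflow?
    hi = (a_hi + b_hi + carry) & MASK32  # fold the carry into the high half
    return (hi << 32) | (lo & MASK32)

assert add64(5_000_000_000, 4_000_000_000) == 9_000_000_000

Every 64-bit operation turns into several 32-bit ones like this, which is where the "twice as much work, half as fast" worst case comes from.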

Going to 128 bits would make the hardware extremely hard to design and build, as well as expensive.  At the same time going from 64 bits to 128 bits on address buses and integers doesn't buy you nearly the gains that going from 32 to 64 bits did.  With 64 bits you can reference data in files that are up to 18,446,744,073,709,551,616 bytes in size, or individually address over 18 million terabytes of memory.  Very few people have files that big or computers with that much memory.  (yet.)  Maybe we'll get there some day, but for now it isn't worth the costs.
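If you want to sanity-check those numbers, a couple of lines of Python will do it (decimal GB/TB here, so the figures differ a bit from the GiB/TiB you'll sometimes see quoted):

# Address-space sizes for 32-bit vs 64-bit byte addressing.
for bits in (32, 64):
    addresses = 2 ** bits  # distinct byte addresses
    print(f"{bits}-bit: {addresses:,} bytes "
          f"= {addresses / 10**9:,.1f} GB "
          f"= {addresses / 10**12:,.1f} TB")

# 32-bit: 4,294,967,296 bytes = 4.3 GB = 0.0 TB
# 64-bit: 18,446,744,073,709,551,616 bytes = 18,446,744,073.7 GB = 18,446,744.1 TB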

  • Cool (Kakkoii) 1
