[#] Wed Dec 16 2020 09:46:25 EST from ParanoidDelusions


Close enough to California that the coastal elites in San Jose and Los Angeles can hop in their private jets and be here in an hour and a half. 

You may be on to something. 

 

Wed Dec 16 2020 09:10:36 EST from Nurb432 @ Uncensored

Perhaps it really is the underground bunker for the 'elite' when everything goes to pot next year.  Or where they are going to move the servers that house the copy of Steve Jobs' brain, since California is getting pretty scary... 

 

Or relocation of the Antarctic star-gate so the Russians don't get it first.

 

 



 



[#] Wed Dec 16 2020 11:30:35 EST from IGnatius T Foobar


Meanwhile, as you like to point out - the web browser has replaced the OS - a Chromebook is as effective as an i9 for what MOST people are using their computing devices for these days - posting what they had for dinner on Facebook. 

And as you pointed out in another room, you haven't been keeping up with enterprise architecture.

One thing that has really come into its own in the last few years is the Software-Defined Data Center (SDDC). Functions that have traditionally been offloaded to specialized hardware -- the routers and firewalls and fabric managers and storage controllers and other "big boxen" that you find in a data center -- all of them are moving to software. In fact, the project I am working on *right now* involves rolling out private cloud environments that consist of nothing except a rack of servers and a pair of ethernet switches.

This means your server CPUs aren't just sitting around waiting for the disk or the network anymore. They are *running* the disk and the network, in addition to the normal server workloads. To do this you need fast, high density, multicore CPUs, and I can assure you the chips you find in a Chromebook aren't going to cut it.
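To put a rough number on it, here's a quick Python sketch - it just adds up the CPU time going to the storage and network daemons on one node. The daemon names are only examples of what an SDDC stack might run, and it assumes psutil is installed, so treat it as a sketch, not a benchmark:

# Rough sketch: add up the CPU time burned by software-defined storage and
# network daemons on one host. The names below (ceph-osd, ovs-vswitchd,
# spdk_tgt, vhost) are only examples of what an SDDC stack might run.
import psutil

SDDC_DAEMONS = ("ceph-osd", "ovs-vswitchd", "spdk_tgt", "vhost")

def sddc_cpu_seconds():
    total = 0.0
    for proc in psutil.process_iter(["name", "cpu_times"]):
        name = (proc.info["name"] or "").lower()
        if any(daemon in name for daemon in SDDC_DAEMONS):
            times = proc.info["cpu_times"]
            if times is not None:
                total += times.user + times.system  # CPU actually spent, not idle waiting
    return total

if __name__ == "__main__":
    print(f"CPU seconds spent *being* the disk and the network: {sddc_cpu_seconds():.1f}")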

In short, there is *always* an appetite for beefier CPUs. The government isn't the only organization that needs them.

[#] Wed Dec 16 2020 12:27:05 EST from ParanoidDelusions


No - I understand that - I think the situation is that low-cost, low-power, lower-performance ARM-based CPUs have gotten good enough to migrate the vast majority of civilian technology platforms in that direction - and things like Apple's ARM silicon illustrate that this performance is just going to scale. With clustering and distributed processing, and enough ARM CPUs... you'll be able to do what you're talking about, probably with a smaller physical and ecological footprint - with ARM. 

And, I never said Intel was going to get out of enterprise data center solutions. I think they're moving away from consumer platforms. I don't think that is any great conspiracy theory. They've competed less aggressively in this space for a while. They've given more ground to AMD. But a company that sees its core business diminishing doesn't continue to expand its operations, headcount, and investments as aggressively as Intel is doing. They're seeing more competition in the corporate data center too. 


But my guess is that when you get into NSA data centers in the middle of Utah and CIA and FBI computers that can crack modern encryption in a few days - it isn't ARM silicon driving those solutions. It probably isn't even what you would consider the flagship Intel Xeon processor. 

And... at times in Intel's past - I've seen and played with those systems - the ones that are top-secret prototypes for their consumer/enterprise-grade line. I was playing with Itanium systems in our data center when the media was still speculating on what 64-bit systems would be like - and I know the guys with classified clearance were telling us, "those are just toys..." back then. It could have been just bravado - but I trust those guys. If they told me something was coming, it always eventually showed up. 

It is worth noting that part of the idea of why you divide a technology into "consumer/civilian-grade good enough" and "military-industrial specific design" can be explained by looking at the gun control debate. Limiting us to semi-automatic weapons and different standards of ammo and armor - different grades of optics - makes it difficult or impossible to outmatch the government as civilians.  If you're going to do that with firearms - why wouldn't you do it with the technology you depend on to predict civilian habits, to monitor and control their actions, to snoop on what they're saying, doing and watching? 

I think we're being driven into a box canyon. 







Wed Dec 16 2020 11:30:35 EST from IGnatius T Foobar @ Uncensored

Meanwhile, as you like to point out - the web browser has replaced the OS - a Chromebook is as effective as an i9 for what MOST people are using their computing devices for these days - posting what they had for dinner on Facebook.

 




[#] Thu Dec 17 2020 11:14:54 EST from DutchessMike


The move to ARM for Apple was the end of my relationship with them.  They announced this two years ago and I moved my MBP to run Linux instead of OS X, and I ordered a Thinkpad to replace it about 6 months later, when I was comfortable enough to run Linux full-time for work.  Forcing everyone to rebuild their apps to run on ARM rather than x86_64 is a pain for commercial developers, but it will throw a dent in a number of open-source projects that are based on C++ or any other language you use for speed and direct access to hardware... Python, Ruby, Java, and other languages will survive with their runtimes ported.  I've already started down this road for Pi projects that I work on in my spare time, but anything I do for the desktop in the future will need to be cross-compile friendly, and that's a mess of toolchains I wouldn't wish on solo developers.
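To give a flavor of the toolchain mess, here's a rough sketch - a hypothetical build script that assumes Debian-style cross packages (g++ plus g++-aarch64-linux-gnu) are installed - of the kind of dual-target build every desktop project now needs:

# Rough sketch of a dual-target build: compile the same C++ source for the
# host (x86_64) and for 64-bit ARM. Assumes the native g++ plus Debian's
# g++-aarch64-linux-gnu cross package; main.cpp is a made-up file name.
import subprocess

TARGETS = {
    "x86_64": "g++",                     # native host compiler
    "aarch64": "aarch64-linux-gnu-g++",  # cross compiler for 64-bit ARM
}

def build(source="main.cpp"):
    for arch, cxx in TARGETS.items():
        output = f"app-{arch}"
        cmd = [cxx, "-O2", "-o", output, source]
        print("building:", " ".join(cmd))
        subprocess.run(cmd, check=True)  # fails loudly if a toolchain is missing

if __name__ == "__main__":
    build()

And that's before you get into per-target dependencies, sysroots, and a CI runner for each architecture.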



[#] Thu Dec 17 2020 12:14:03 EST from ParanoidDelusions


WRONG About M1 Mac Speed — Apple Silicon Explained!


What do you think, fact or fanboi fantasy? 

https://www.youtube.com/watch?v=vg0AF166eVI&t=709s




[#] Thu Dec 17 2020 12:30:17 EST from ParanoidDelusions


I never bought in, so I won't ever have to sell out. 

I buy Macs as a curiosity, and so that I have a passing experience with them when people tap me for assistance - and because I just have a general interest in OS platforms - what they do better, what they do worse. 

The .dmg package is an interesting install process - and it shows where commercial vendors just have a huge advantage over the FOSS community and its competing Yum and Apt package management solutions. 

I love some of the features of OS X, like how maximizing a window instantly sends it to its own virtual desktop, and how gestures so effectively switch you between those desktops. 

I love how transparent and user-friendly they make their e-mail client setup. 

And their overall aesthetic design is very pleasant. 

But, over the last week, I found the machine was spiking all the time: from a normal operating temp of about 88°F at idle and 143°F max, to more like 120°F at idle and 140°F on average, with spikes to 160°F or higher. This is a 2011 MBP with the Radeon GPU bug - so heat is the mortal enemy of this model. 

I repositioned the laptop chiller it sits on, and I considered opening it up and replacing the thermal grease... 


But then I looked at Activity Monitor - and the Safari Bookmark Sync Manager was consuming 80% of the CPU. 

Google told me this is a frequent issue. 

I discovered that for some reason when the bookmark sync issue was happening, the Mac was switching from the integrated Intel HD GPU to the Radeon. 

I got that resolved, and temps dropped back down to where they were. 
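For anyone who hits the same thing and wants to do the hunt from a terminal instead of Activity Monitor, here's a quick sketch - it assumes Python with psutil installed, and the one-second sample window is arbitrary:

# Quick-and-dirty Activity Monitor stand-in: sample CPU usage for a second
# and print the top consumers, so a runaway helper like a stuck bookmark
# sync agent stands out.
import time
import psutil

def top_cpu(count=5, sample=1.0):
    procs = list(psutil.process_iter(["pid", "name"]))
    for proc in procs:
        try:
            proc.cpu_percent(None)            # prime the per-process counters
        except psutil.Error:
            pass
    time.sleep(sample)                        # let usage accumulate
    usage = []
    for proc in procs:
        try:
            usage.append((proc.cpu_percent(None), proc.info["pid"], proc.info["name"]))
        except psutil.Error:
            continue
    for pct, pid, name in sorted(usage, reverse=True)[:count]:
        print(f"{pct:6.1f}%  {pid:>6}  {name}")

if __name__ == "__main__":
    top_cpu()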

Apple being able to remotely cook my old MBP with their bookmark manager bothers me. 


Thu Dec 17 2020 11:14:54 EST from DutchessMike @ Uncensored

The move to ARM for Apple was the end of my relationship with them.  



[#] Fri Dec 18 2020 21:14:17 EST from Nurb432


Ran across a story tonight that Microsoft has been working with NVIDIA for a bit now, with the end goal of making their own chips instead of getting them from Intel, much like Apple.  Mainly for their data centers; they are not going to come out with a line of branded PCs. (Nothing new, as NVIDIA has had its own IP licensed from ARM for some time now, not just their own GPU stuff.  I guess they are using RISC-V too, embedded into some of their GPUs as controllers of some sort.)

And AMD is working on some ARM designs of their own for the consumer market.

Looks like Intel may be on the wrong side of progress in the coming decade... (Not sure whatever happened to their StrongARM line, which was licensed ARM IP, through DEC or some such back-room nonsense.)



[#] Fri Dec 18 2020 22:29:14 EST from ParanoidDelusions


This could also be evidence of my claim. 

It doesn't make sense that Intel is expanding so rapidly and building so aggressively if rumors like this are easily accessible to people like you and me. 

I mean... it *could* just be denial. But I worked at the company - and that isn't their corporate culture at all. They give you a book called "Only the Paranoid Survive" on your first day, and they're serious about you reading it. It is like their bible. 

Their biggest wins have not been in consumer or corporate markets for a decade. I knew about the football "instant replay 360" thing half a decade before I saw it on an NFL broadcast. I couldn't disclose it, not because of any NDA I was still under, but because the trusted source who told me all about it was under NDA - and one betrayal would have cut me off from lots of other insider information I could leak on Tech Republic. 

So... the thing is - that was SUPPOSED to be interactive. On NFL Live, or whatever the cable thing was, any consumer was supposed to be able to freeze any point in the game and instantly see a CGI view, 360 degrees in a dome around the point of interest. 

Their Super Bowl drones were also something I had the drop on about a year and a half before they did their thing. 

And both of these things are commercial applications of military-grade technology. 

Imagine having a tactical real time 360 degree view of a battlefield. If those drones can blow you away at the half-time show - imagine what they're capable of in a combat theater. 

Imagine how those two technologies could work *together* in a military campaign. 


Here are some additional things. 

Intel was once the World's Largest Semiconductor Manufacturer. There were still people with license plate frames that bragged about this when I started. Taiwan ate their lunch, and they became the world's biggest consumer CPU manufacturer. 

They liked being a commodity component supplier selling supply line products to companies that assembled consumer products. They *hated* being a consumer manufacturing company. 

They refused to ever let their brand be put on autos, because they thought the automotive industry's poor reputation with consumers would tarnish their brand. Lots of cars had Intel inside, and the Big Three, the Japanese, and the Germans all wanted to put "Intel Inside" logos on their cars, and Intel would *never* let that happen. Not even on a Benz. 

The Pentium FDIV (floating-point division) flaw was a MAJOR trauma for the company. They spent months dealing directly with end-user consumers, answering questions. Listening to veterans describe it was like listening to Vietnam vets talk about the Tet Offensive. Seriously. 


2010 was a watershed moment for them, when ARM was coming up and traditional paradigms of consumer computing were changing radically. 

The more you guys work through it with me - the more I am certain - Intel isn't in trouble. They're abandoning consumer and corporate technology markets for the most part. 

But they're not closing up shop. 





Fri Dec 18 2020 21:14:17 EST from Nurb432 @ Uncensored

Ran across a story tonight that Microsoft has been working with NVIDIA for a bit now, with the end goal of making their own chips instead of getting them from Intel, much like Apple.



 



[#] Fri Dec 18 2020 22:31:08 EST from ParanoidDelusions


What I'm saying is - you guys are all going, "We finally defeated one of the great Satans..." 

And I'm saying... "No... you made it more evil." 

 



[#] Sat Dec 19 2020 08:08:11 EST from Nurb432


I don't think I'd go so far as saying they are "defeated", just that they are going to take a hit.  Unless, as you say, they have a 'plan' (unlike the Cylons... "plan", my ass). With Google, Amazon, and Microsoft all looking elsewhere for their data centers, and Apple now for their desktops, it does mean some loss of revenue.   The mobile market is already gone.

And while I totally agree it's nothing to sneeze at for income, military work only goes so far; it only takes one public breach for them to turn on you. Or you piss off the wrong person out in DC by accident one afternoon, or don't pay off the right one. 

Fri Dec 18 2020 22:31:08 EST from ParanoidDelusions @ Uncensored

What I'm saying is - you guys are all going, "We finally defeated one of the great Satans..."

 



[#] Sat Dec 19 2020 14:21:30 EST from IGnatius T Foobar


Ran across a story tonight that Microsoft has been working with NVIDIA for a bit now, with the end goal of making their own chips instead of getting them from Intel, much like Apple.  Mainly for their data centers; they are not going to come out with a line of branded PCs.

I heard the same thing ... might have been in one of the Linus Media Group channels, I think. They made it a point to observe that the Microsoft ARM effort is being run by the division that runs Azure, not the division that runs Surface. This would of course imply that it's a data center focused effort.

Go ahead and believe what you want, but I'm calling this one a simple effect of commoditization. Intel got addicted to the revenue from big-margin CPU sales. Frankly, the commoditization of general purpose CPUs is *decades* overdue. As is usually the case with these things, if you don't cannibalize your own business, someone else will.

I'm going to just sit back and enjoy the show, knowing that my Linux workloads will run on pretty much anything.

[#] Sun Dec 20 2020 11:15:11 EST from DutchessMike


I don't see it as selling out - I used their tools to practice my craft (I write software and do networking/cyber-security nonsense for my clients) because they were better than their competitors (Windoze).  Now that they've taken a turn in a different direction, I wish them well - I'm sure there are legions of people doing other things, or who simply want acceptance from their peers, who are happy to continue using OS X and Mac products at any price.  OS X was neat for a while since they would use Intel hardware and provide a BSD-like OS with a nice-looking GUI.  Sure, my Linux setup doesn't look as cool as OS X does, but my computer (with better hardware) does what I need it to do and I spent significantly less to get it up and running compared to buying another MBP.  I also "purchased" freedom from issues like the one you described with the CPU usage.  My wife uses my old MBP, which runs fine... except when it's 2 AM and I hear the fans on overdrive because of the same issue.

I have a project in 2021 that will need to be deployed on iOS and likely another that will need an app ported to OS X.  I'll begrudgingly buy a used Mac mini solely for the purposes of running Xcode and shipping builds for acceptance and deployment.  It will be far easier for me to spend a couple of hundred bucks on a machine to deal with Apple's ever-changing idiosyncrasies as part of the "hardware" toolchain than it would be for me to move everything over to a Mac for development.

Mike

Thu Dec 17 2020 12:30:17 EST from ParanoidDelusions @ Uncensored

I never bought in, so I won't ever have to sell out. 

 



[#] Sun Dec 20 2020 22:49:24 EST from ParanoidDelusions


"because they were better than their competitors (Windoze)"

I refer back to Ig's earlier comments. They're all the same. There are little differences - and those may make one choice better or worse relative to your particular use model on a personal level. But, they've all got warts, too. 

Linux is kinda like a butterface in this regard - you've got to love her for her personality, or that thing she does for you with her tongue - whatever it is that keeps you coming back.

I'm not judging. Someone has to give the ugly girls love too. I appreciate that you Linux guys stepped forward and took that bullet for the rest of us over here with our pretty, shallow and dumb OSes. 

;) 

I get why OS X people pay so much more for the same hardware, only crippled by thermal limitations. I get why Linux people get so happy that they can get a full-fledged modern OS running on a 386 they pulled from the back of their local computer store, repaired with conductive glue and YouTube videos, then hooked up to a 250 GB FreeNAS-attached drive they hacked out of an old PowerEdge array of 500 MB SCSI ultra-wide drives their shop was sending to the recycling center... 


And I get why Windows people don't want a regular Intel PC with a Nvidia GPU and a Microsoft OS on it, too. 

They're all really good at what they do best, and they all suck at things the other guy does better than they do. 




Sun Dec 20 2020 11:15:11 EST from DutchessMike @ Uncensored

I don't see it as selling out - I used their tools to practice my craft (I write software and do networking/cyber-security nonsense for my clients) because they were better than their competitors (Windoze).



 



[#] Sun Dec 20 2020 22:50:49 EST from ParanoidDelusions


And I get why Windows people don't want a regular Intel PC with a Nvidia GPU and a Microsoft OS on it, too. 

Sun Dec 20 2020 22:49:24 EST from ParanoidDelusions @ Uncensored

 



[#] Mon Dec 21 2020 11:01:11 EST from Nurb432


If you are doing AI or GPU mining, you pretty much need NVIDIA at this point. It's why I have several. 

 

And I dunno, if you are pulling old PCs out of the dumpsters you are better off with NetBSD.   Modern GUIs on Linux (or FreeBSD...) suck resources too, just like Apple's or Microsoft's (it's obscene, really).  But to be fair, on 1:1 hardware, *nix runs faster than Windows, as there are far fewer background processes eating away at things. I think OS X is better in this regard too.  It's even worse if you are on a corporate network with all its overhead.



[#] Mon Dec 21 2020 12:51:35 EST from nonservator


>Ampere Altra review: 2x 80 cores Arm server performance monster

Top comment:

"Once again, I am excited to see alternative systems.

"Once again, I am disappointed to see I cannot actually buy one."

Same old, same old.

Every day I am amazed at how hard people make it for you to give them money.



[#] Mon Dec 21 2020 13:17:59 EST from Nurb432


If you are giving out money, I'll take it.

 

:) 

Mon Dec 21 2020 12:51:35 EST from nonservator @ Uncensored

 

Every day I am amazed at how hard people make it for you to give them money.



 



[#] Mon Dec 21 2020 14:07:41 EST from ParanoidDelusions


"Faster" is such an ambiguous and relative statement in computer processing, in most cases. 

Debian BOOTS slow as dogshit. It isn't just that you can see the entire boot process - there is a BUNCH going on. Today I booted a Windows PC (water-cooled i7 desktop workstation with 32 GB of RAM, Nvidia GTX 1080 and RTX 2080 GPUs, and a 1 TB SSD boot drive), a 2012 MBP i7 (500 GB SSD and 16 GB) and my Surface Pro (500 GB SSD and 16 GB). The tower beat them all, and the Mac beat the Surface Pro. 

But a lot of that comes down to my use models for the different machines. My Surface is my daily driver and loads a lot of things at startup. 

Then in real-world performance - Macs have often been claimed to beat equivalent PCs from competing manufacturers running Windows - but the reality has often been that thermal throttling prevents Macs from hitting their real-world peaks for sustained periods of time - even if there is something about the custom PCB design and bus engineering that makes them technically "faster" than a reference-design PC board. 

With Linux, traditionally, faster didn't mean much if you couldn't do the things you wanted to do that require that speed - like fast 3D gaming, 3D rendering, or photo editing using industry-standard tools like Photoshop. 

"But muh GiMP iZ FuhR33!" No one cares in mainstream graphic design. Gimp and Blender are difficult, unpolished, poorly documented, and not really dominant in the industries that use tools like this. "I can run Illustrator in WINE!" Sure you can. Graphic illustrators don't want to mess with those hassles. 

I understand that there is some new way of playing Windows games on Linux and "some games even run better on Linux than on native Windows." 

I have my doubts about this. We've been hearing this for years - the fact is, if you want a gaming PC, your best bet is, and remains, to do it on a Windows machine. Most gamers are gamers, not OS fans, and will continue to do it the most straightforward, accessible and easily supportable way possible - on a Windows machine. The only people who want to do it on Linux are *Linux* fans first, and gamers second - willing to make sacrifices on gaming for the principle. 


I guess if Linux could actually give a significant and meaningful performance increase - pro and semi-pro gamers might abandon Windows in that, "anything for a competitive edge," kind of sports philosophy. 

So you get back to the demographics of who uses what, and how the tribe you belong to determines which platform you see as superior. Developers and coders tend toward Linux and OS X. People who compile from source like *nix-based OSes. That is a fraction of the user base for personal computers. "Dumb lusers" prefer Mac - because they like the appliance-like simplicity of the OS and bundled suite of applications. Windows remains the middle ground that does the most for the most users - all the right compromises. The grocery-getter SUV of computer OSes. Best is *relative* to what your goal is. 

But there is nothing wrong with being a "corner case" use model - either - and feeling that your choices are "better" or smarter or more well informed than the middle of the road probably isn't unjustified. 




Mon Dec 21 2020 11:01:11 EST from Nurb432 @ Uncensored

If you are doing AI or GPU mining, you pretty much need NVIDIA at this point.



[#] Mon Dec 21 2020 14:31:49 EST from Nurb432


Faster as in response time doing similar functions. Of course in many cases it's not the *same* application (see below), but it's functionality you can compare, so it's not quite apples and oranges. Sure, that speed is influenced by what is running under the surface, but if an OS is running a bunch of nonsense just to keep itself running, yes, I'm going to blame it. GUI toolkits also come into play, but as an overall user 'feel' you can compare them.

Boot times? Well, I can say that on the same hardware, for me, Debian boots noticeably faster than Windows. Have I measured it? No; to be honest, boot time on modern machines is not a factor to me - a few seconds either way, I don't care. It's when it gets into minutes of waiting to log in, like Windows can when it's in a bad mood, that I care.
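If I ever do want a number, something like this does it on a systemd box (rough sketch; assumes systemd-analyze is on the PATH, which Debian has and Windows obviously doesn't):

# Rough sketch: ask systemd how long the last boot took. Only works on a
# systemd-based Linux like Debian; on Windows you're stopwatching by hand.
import subprocess

def boot_time_report():
    result = subprocess.run(["systemd-analyze", "time"],
                            capture_output=True, text=True, check=True)
    return result.stdout.strip()

if __name__ == "__main__":
    # Prints something along the lines of:
    # Startup finished in Xs (kernel) + Ys (userspace) = Zs
    print(boot_time_report())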

And yes, for the most part applications are different 'work-alikes' across platforms, with a few exceptions, such as LibreOffice, Blender, Spyder (I do Python), Chrome, FreeCAD, PostgreSQL, and a few others. But I do tend to see faster "times" with Linux/BSD than Windows/OS X. Earthshaking? Perhaps not, but a noticeable amount to me at least.

GPU performance, that depends on whose drivers you use. The open-source NVIDIA drivers, for example, work well enough for video and such, but won't break any records for gaming.  The OEM NVIDIA drivers work as well as the Windows drivers (I'm sort of forced to use them on my NVIDIA AI boxes). Mali drivers, same sort of situation. Intel, I can't really tell a difference between OSS and commercial. Perhaps Intel actually open-sourced theirs, I dunno, never cared enough to look.

I think it's more than just speed issues keeping people from porting games away from Windows; it's more about not wanting to maintain two code bases for a smaller market. Just my feeling, but not being in that world I can't talk intelligently about it.   Pretty sure games use a lot of OS tricks, so it would be a pain to support two when there isn't a large *paying* user base. 

Blender? All I'll say is I have been a fanboi even back when it was still a commercial product in the early 90s, being used by NaN for their business, and you had to pay to play. (I still have my license; it was a work of art. And my signed manual...) Is it hard to use? That's relative; most 3D tools at that level have their own learning curve. 

I can't comment personally on the GIMP/Photoshop thing. I know professionals who 'get by' with it out of principle after Adobe went cloud, but I'm not sure if it's at the same level at this point in time. But they get their jobs done, so does it matter?

 

That is all I got for a Monday :) 

 


