Archived posting to the Leica Users Group, 2004/08/11
This is certainly true in many parts of the computer technology industry, most clearly in consumer computing, but also, to a lesser degree, in "industrial" equipment. BTW, this does come back to cameras and Leicas, so bear with me.

I came up in the 80's: Apple IIs, Macs and PCs at the personal and small-business level. Then professionally with Sun gear running the Sybase RDBMS, Windows in its various incarnations, and finally, for the last 7 years, mostly Linux, often running the Oracle RDBMS. Folks like me target Linux because it's a less expensive way to get what Sun has traditionally offered: an extremely stable set of industrial tools that do not change over decades, literally (let's not get into SunOS vs. System V). I'm talking about the software here, for the moment. There may be bug fixes and a slow stream of enhancements, but complete backward compatibility (and hence extensive testing of any new code) is put at a huge, huge premium. Some of the world's favorite Web sites run on Unix, Linux or other free Unix variants (FreeBSD): Google and Yahoo, for example. And for good reason. I have customers who have not rebooted their servers in over 18 months!

I'm a great admirer of Microsoft. They served their customers well during the 80's and early 90's. Hell, there is still a lot of value in a copy of Windows XP. Windows NT was designed by none other than Dave Cutler, the architect of the famed industrial-strength VAX VMS operating system (hardcore stuff). The problem is the economics of consumer technology. Cheap always wins out over quality. At least in the mid-90's, most of the famed "Blue Screens of Death" were caused by errors in the add-in device driver software that makes modems, video cards and other peripherals go. A big, big percentage of the cost of making a video card is writing the software drivers.
But to get cheap stuff out the door quickly and into consumers' hands for $69, well, skimping on software quality and testing was the order of the day. So when we buy our BestBuy HP wonder computers, they're filled with cheap peripherals with really crappy, hastily written software drivers. Hence our "boo hoos" when our Windows boxes crash and burn. No surprise, but go buy one of nVidia's higher-end $1,600 video cards, a dual Opteron motherboard, an industrial Gigabit Ethernet card, a dual redundant power supply, a CAD-engineer-grade optical mouse, a 3Ware SCSI RAID hard drive controller card (you get the idea), and then driving your Windows PC might be almost like driving a Leica M3 with a fresh CLA :-)

As for me, I use mostly Linux (personally, not just in professional systems work). Religious backward compatibility (with just a tiny few all-too-human exceptions). The peripheral drivers are very good, and if they're not, no one lies about it. It's all out in the open. No surprise, but the professional-grade components often get good, solid drivers first.

So I might buy some cheap consumer electronics, a DVD player or a Palm Pilot. But I think of it like buying some really nice imported goat cheese: it will be tasty as long as it lasts. Cheap electronics are consumables. Fortunately, there is a large industrial market for high-grade computer equipment. High quality, excellent support for years and years to come. You too can have it for personal use if you are willing to pay.

But I'm a photo newbie, and I'm not so sure about the emerging economic dynamics of the camera market. From what I hear on this list and elsewhere, major changes are underway. Is there a healthy market for "industrial grade" camera gear? Maybe Mamiya RZ stuff for "industrial" studio work. I don't really know.

Scott

Lee England wrote:
>Regarding the comment "but for my own inner, personal ambitions I seek a
>medium that will change little over time."
>
>I recall an article by a writer for, I believe, the New Yorker whose premise
>was that technological change was coming too quickly for even the
>manufacturers to master. In the old days some new innovation, say automatic
>transmissions, would last long enough for manufacturers to perfect and for
>users to master, until finally the product was near flawless and void of
>quirks and bugs. Now, computers and software are not around long enough for
>that process to occur. Computers freeze, crash, have glitches, and this is
>regarded as common and with no one able to figure out what's wrong.
>Software is the same--new products come out with patches following in weeks.
>Bugs are common, followed by products having new bugs. Mastering the use of
>these products becomes difficult and may affect their use for artistic
>purposes. The people I've seen buying digital cameras are already
>considering buying the new, improved versions. Of course, there will be a
>new learning process.
>
>Lee England
>Natchez, Miss.
>
>_______________________________________________
>Leica Users Group.
>See http://leica-users.org/mailman/listinfo/lug for more information