dewme

About
- Username: dewme
- Visits: 930
- Roles: member
- Points: 15,783
- Badges: 2
- Posts: 6,106

Reactions
Macintosh launched on Jan 24, 1984 and changed the world -- eventually
I distinctly remember using the first Mac for the first time. Where I worked, all of the engineering documents were handwritten by engineers like myself and committed to typewritten form by dedicated “technical writers.” They used DECmate word processor machines with VT220 terminals, so all edits had to go through the handwritten-to-typed loop. It was painful. This was in the mid to late 1980s, and we were literally doing copy & paste using paper and paste, both for editing and for inserting hand-drawn or mechanically produced diagrams.

At some point the company gave us exactly one Macintosh computer to share with the entire group of 20-40 people, including managers, engineers, and proposal writers. This machine was seen as exotic and was used exclusively for drawings and, even more so, for creating presentations in Presenter/PowerPoint. Of course all of these presentations were printed on vellum paper for use with overhead projectors.

It all seems so primitive by today’s standards. But in retrospect, one of the things that advanced personal computers did in the business world was to eliminate a lot of intermediaries like tech writers and administrative assistants. The tasks held by those roles and others were folded into the engineering and business roles, so now you have engineers being concerned about things like document formatting, fonts, and putting together snazzy presentations.

On the surface this added responsibility may sound like a trivial distraction, but on a human level it helped professionals gain a better understanding of how to communicate and connect with people outside of their technical enclave. Being technically astute is great, but if you can’t share your discoveries and great ideas easily with the rest of the community in which you live and work, your efforts may be wasted.

The Macintosh made it easier for professionals, and people in general, to express themselves in a faster, more fluid, and more communicative manner by decreasing the time and effort required to move ideas from their source (the brain) to an audience without losing as much in the traditional translation layers. I could put together both the written and graphical content myself and move things along more quickly.

The Macintosh, regardless of its sales numbers, established a new standard for human-personal computer interaction. The PC world eventually caught up, but it was a laborious transition to bridge from the text-based to the graphics-based human interaction model, something Apple nailed right out of the gate with the Macintosh.

The Macintosh’s impact on how personal computers relate to their users is far greater than the impact of the Macintosh machine itself. The Macintosh wasn’t the first expression of the new way to do personal computing, but it was the first machine that brought the new model to the masses in a compelling and relatable form. And the beat goes on to the present.
M4 Mac mini review three months later: the perfect headless Mac
I've been having all kinds of problems getting my M2 Pro Mac mini to connect to my 5 GHz WiFi. I tried different channels, tried different WiFi settings, contacted Apple support, etc., and nothing solved the problem. For the sake of expediency I settled on using it with 2.4 GHz channels. Fast forward a few months, and I decided to give it another shot today by moving or reorienting the Mac mini. I started off with the intention to flip it on its side, but I didn't have a good way to secure it, which would not be an issue with the new Mac mini. So I ended up simply raising the Mac mini on an upside-down wooden organizer box in the same exact location where it has been sitting all along and ... poof ... it connected instantly to my 5 GHz WiFi, and with very good signal strength.
Moving the Mac mini a mere 2 inches higher off the desk solved my problem. I do understand the directionality issues with increasing radio frequencies. I figured that with the closest access point being only 20 feet away, and with line of sight over my office chair at about a 30 degree incline, connecting would not be an issue. In my case I don't think it was physical obstructions causing issues; it was electromagnetic interference from the surrounding devices, including a monitor and a dock, probably from the higher harmonics of things like USB3/4, TB, DP, and HDMI communication ports, switching power supplies, etc.
Is this vulnerability a design issue? Hard to say. It's up to you to decide for yourself, but I have a Mini PC sitting right next to the Mac mini and I've never had any 5 GHz WiFi issues at the same exact location. It is a consequence of where the designer placed the antenna inside the chassis and the RF environment in my space. I'm not losing any sleep over this. Just be aware of how sensitive these tiny devices are to placement and orientation, especially with higher frequency channels. I would not be surprised at all if 6 GHz WiFi is even more sensitive.
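For a rough sense of why the higher bands are less forgiving, here is a minimal sketch (assuming the roughly 20-foot access-point distance mentioned above) that computes free-space path loss at 2.4, 5, and 6 GHz. It only models the distance/frequency term, not the interference or antenna-placement effects described in the post.

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    c = 299_792_458.0  # speed of light in m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

distance_m = 6.1  # roughly 20 feet to the access point, per the post
for band_ghz in (2.4, 5.0, 6.0):
    loss = fspl_db(distance_m, band_ghz * 1e9)
    print(f"{band_ghz:.1f} GHz: {loss:.1f} dB free-space loss at ~20 ft")
```

Over that same distance the formula gives roughly 6 dB more loss at 5 GHz than at 2.4 GHz, and a bit more again at 6 GHz, which is consistent with the higher bands being touchier about placement even before interference enters the picture.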
New UK ID app yet again fumbles tech that Apple has already perfected
avon b7 said: It is logical for governments to want to offer solutions without being tied to third parties ...
... <cut to save space>
In spite of all the potential negatives the desire to produce a homegrown solution is completely understandable.

Are you trying to confuse us with logic again? What you're saying makes complete sense. One could write a thousand similar articles lamenting why a government, or even private businesses, took it upon themselves to do what outsiders would casually classify as "reinventing the wheel."

This even happens within businesses, where more than one group is trying to solve the same problem that another inside group is already working on or has already solved. Some of it is a case of "not invented here" (NIH) syndrome, which seems to occur too often in engineering organizations. I wish I could say that it is a rare occurrence, but in my experience it is embarrassingly common ... sigh.

I believe a lot of these situations occur because different people, teams, companies, business sectors, local, state, and federal governments, countries, etc., often start out with the mentality of: "We have to solve this problem," "We need to own the solution to this problem," "We need to control the solution," and of course the cherry on top of the sundae, "We need to get credit for solving this problem." This mentality is especially effective when applying tunnel vision as a primary execution strategy.

As you said, there are of course pragmatic considerations behind a lot of these sorts of things. I saw the same kind of thing happen when I worked in the defense contracting business. At one point the government wanted to control every aspect of what got built, down to the specification of individual electrical and electronic components and even programming languages. They also required up-front acquisition of spares for the expected service life of the systems. Everything was one-off and had to meet stringent mil-spec requirements, with testing out the wazoo. Claims of "$1000 hammers" were actually true, because that's what it cost when you add up all the required testing, certification, and documentation.

Ultimately these systems became exorbitantly expensive to acquire and maintain. Cost and schedule overruns were the norm. Like any system based on a snapshot of technology when a project started, and with long lead times, most of the systems became obsolete or very inefficient shortly into their service life.

The response to these issues led to a lot of systems being redesigned, and new systems being built, using commercial off-the-shelf (COTS) components. I worked on one of the very first COTS redesigns while the project was still in development, and on this small project the contract was cut in half simply by using COTS components and subsystems.

Sounds like a great solution ... until a few years or so later, when the COTS components could no longer be maintained because they were no longer commercially viable for the manufacturer to produce. I ran into similar issues with non-military systems and products built from commercial off-the-shelf components. Simple things like memory chips became unavailable because they were too small to justify any manufacturer continuing to produce them.

Many times these supplier issues were partially caused by the quest to limit component costs on high-volume or low-margin products. If you can save $0.50 on a memory chip by buying the absolute minimum size that works, that adds up. But it also makes it more likely that you're going to run into supplier issues down the line. But that's someone else's problem, not yours.
Apple won't return TikTok to the App Store until it's sold to a U.S. buyer
mikethemartian said: After World War I the U.S. Navy, with support from President Woodrow Wilson, forced the British Marconi Company to sell its U.S. subsidiary (Marconi Wireless Telegraph Company of America) to the American company General Electric.
Asus ProArt Display 5K review: 27-inch Retina for a bargain
gatorguy said: snookie said: gatorguy said: braytonak said: I have to disagree with such a favorable rating. Backlight banding when auto brightness is enabled? Unusable HDR over USB-C? Those two points alone should barely earn 4. Add in the unlabeled controls (unless the labels appear on-screen), cheap plastic, and speakers that are most likely only good enough for notifications to really make it a pass-with-a-push. I know, the point is the display itself. I’ve waited almost ten years for a decent 5K. Maybe the next Studio Display will be the sweet spot.
Speakers on a monitor are for users who don't care all that much about sound quality. Good enough for a few YouTube videos but not for great music. So that's not a ProArt fail IMO.
Banding with auto-dim, and only on a very pale background, is another non-issue as far as I'm concerned. I have my monitors in the same place under the same lighting 24/7, and all of them have hoods to minimize stray light. Why would I want persistent auto-dim? My major objection is that it changes my perceived color tones. That's the same reason I turn off auto-dim on my smart TV. I would understand if the light was constantly in flux, like with my smartphone wanderings. For the limited times I want to dim my computer monitor, manually doing so once or twice a week as desired (or in my case never) isn't problematic. Again, not an Asus fail.
Controls? Yes, they are labeled on screen, but I will agree I'm no fan of how Asus incorporates the controls. It could be improved. Fortunately that's another feature that's rarely touched after setup. Dimming can be accomplished with a hotkey.
Plastic exterior components? They're not hammers, so what's the issue with a lighter build? You set it up and forget it. My current Asus monitors are solid.
Built-in KVM works for me. Both systems that use my Asus monitors are set up with two computers and one monitor, one for design and one for RIP, and we go back and forth regularly each day. Mine are set up with separate KVMs, another box to make room for. Built-in = better.
For color accuracy and system compatibility, I could not be happier, and mine are only 4K. The value of this 5K monitor far outweighs the out-of-pocket cost. "You can buy two and still have money left over versus buying Apple's expensive Studio Display" means a whole lot more to me than whether a monitor frame is plastic.
The Asus has the KVM built-in.
I don't need "better than most monitor speakers" when working. If high-quality sound is important to YOU, even $150 will buy you far more depth, range, and volume than a Studio Display could produce, correct?
I also don't want a camera with its potential privacy issues on any monitor of mine, and never have in my 40 years of computer use. A separate and distinct camera component is used as needed and then removed. YMMV and apparently does.
As others have mentioned, saving hundreds of dollars with the Asus might mean being able to afford a higher-spec'd Apple computer, which would deliver far more tangible benefits.