2022 Working Environment
The change of the year is always a good time to reflect. This year I’ve made major changes in my physical environment by reshaping many of the things about this house we recently moved to in Knoxville. Besides ripping out the entire kitchen, replacing all the floors, and reworking the fireplace, it was a good chance to rethink the office I work in every day. I’m rather persnickety about the lighting, layout, and tools I use (although a lot of people still think I’m crazy for using fairly standard tools, like Word, for writing).
This is my space, pretty much—
I use an adjustable-height desk where I’m either leaning or standing—if I want to sit to read something, I normally grab a tablet and sit in the red chair off to the side, or even go someplace else in the house. I prefer not to read on my main computer screen most of the time. I normally keep ambient light to a minimum and turn my monitor brightness down low as well—below 20%.
I’m currently running an LG 38-inch curved monitor. I don’t game, so I care a lot more about resolution than refresh rate, etc. My main driver is a Microsoft Surface 8, topped out in specs, with a Thunderbolt dock to support all the externals. I’m typing on a Drop ALT with 68g Zilent V2 switches. The smaller keyboard keeps the Wacom pad close by, making it easier to switch between keyboard and pointer. Smaller keyboards like this are perfectly usable if you map all the function and other special-purpose keys onto a separate layer, and then place your layer control keys wisely. I’ve been thinking about switching to a more ergonomic keyboard, but I haven’t made up my mind yet.
For the audio and video gear above the monitor, I use two desk-clamp photography stands on the back of the desk, along with a long cheesebar. The cheesebar holds the Logitech webcam connected to my work machine (which is off to the side), the Dell UltraSharp 4K, the ball mount for a digital camera (for recordings), and an AT4053 shotgun mic. On the side of the desk is a boom arm with a Blue Baby Bottle mic.
The two mics feed into an Antelope Zen Go interface, which allows me to do some minor EQ and such before my voice hits the computer. I used to do all this onboard the computer itself using a Focusrite Clarett, but it’s a lot simpler to push some audio processing onto the interface itself with the Zen Go. These kinds of DSP-onboard interfaces tend to be hard to get up and running, by the way. I worked with an Apollo interface for a solid month before giving up and switching to the Zen Go.
Beside the Zen is a little Tascam recorder; the primary mic is routed through the Zen to this recorder so I don’t need to record on the computer itself (though most of the time I do just record in Audition). I find that when I’m doing training recordings that will be edited and combined later, it’s better to pull as much processing off the main computer as possible to improve the quality and performance of the screen capture process … so I record voice on the Tascam, video on a separate digital camera, and just the screen capture on the computer.
I do have a set of Meze classic headphones hooked up to the Zen Go, but I mostly listen to meetings and music throughout the day on a Klipsch Three.
Audio-wise, I put up a set of acoustic panels along one wall. I’m certain I could do more here, but the panels plus the carpeted floor seem to do okay at keeping the audio sounding pretty clean.
Lights… I’ve switched back and forth between GVM and Neewer over the years. Right now I’m using two Neewer flat panel lights, one of which provides ambient light by bouncing off the ceiling—this is the only ambient light I normally have turned on. There’s another LED panel with a diffuser to my front acting as a key, and a spot with a strong diffuser as far away on my right as I can get it.
Well, that’s my working environment for the moment … if you have questions about why I chose specific pieces of gear, etc., please feel free to drop a comment here, or pm me on LinkedIn.
I’ve recently finished my 16th book (according to Goodreads, at any rate). This one is a little different from my normal fare—it’s essentially an expanded and revised version of my dissertation. Rather than being about technology proper, this latest is an examination of the history and philosophy of the superset of social media, which I’ve dubbed neurodigital media.
Fair warning, some readers might find this book a little … controversial.
From the back of the book—
Social media, shopping experiences, and mapping programs might not seem like they have much in common, but they are all built on neurodigital media. What is neurodigital media? It lives at the intersection of the Californian Ideology, the digital computing revolution, network ecosystems, the nudge, and a naturalistic view of the person. The Californian Ideology holds that individuals should be reshaped, naturalism says individuals may be reshaped, and digital computing provides the tools, through network ecosystems theory and the nudge, that can reshape individuals. This book explores the history and impact of neurodigital media in the lives of everyday users.
Hedge 113: The PLM with Jeff Jakab
Over the last few episodes of the Hedge, we’ve been talking to folks involved in bringing network products to market. In this episode, Tom Ammon and Russ White talk to Jeff Jakab about the role of the Product Line Manager in helping bring new networking products to life. Join us to understand the roles various people play in the vendor side of the world—both so you can understand the range of roles network engineers can play at a vendor, and so you can better understand how products are designed, developed, and deployed.
Quality is (too often) the missing ingredient
Software Eats the World?
I’m told software is going to eat the world very soon now. Everything already is, or will be, software based. To some folks, this sounds completely wonderful, but—leaving aside the privacy issues—I still see an elephant in the room with this vision of the future.
Let me give you some recent examples.
First, ceiling fans. Modern ceiling fans, in case you didn’t know, don’t rely on the wall switch and pull chains. Instead, they rely on remote controls. This is brilliant—you can dim the light, change the speed of the fan, etc., from a remote control. No unsightly chains hanging from the ceiling.
Well, it’s brilliant so long as it works. I’ve replaced three of the four ceiling fans in my house. Two of the remote controls have somehow attached themselves to two of the three fans. It’s impossible to control one of the fans without also controlling the other. They sometimes get into this entertaining mode where turning one fan off turns the other one on.
For the third one—the one hanging from a 13-foot ceiling—the remote control sometimes operates one of the other fans, and sometimes the fan it’s supposed to operate. Most of the time it doesn’t seem to do much of anything.
The fan manufacturer—a large, well-known company—mentions this situation in their instructions and points to a FAQ that doesn’t exist. Searching around online I found instructions for solving this problem that involve unwiring the fans and repeating a set of steps 12 times for each fan to correct the situation. These instructions, needless to say, don’t work.
There is no way to reset the remote, nor the connection between the remote and the fan. There is no way to manually set a DIP switch so the remote talks to one specific fan. Just some mystical software that’s supposed to work (but doesn’t), and no real instructions on how to resolve the problem. The result will be a multi-hour wait on a customer support line, hours of my time spent sorting the problem out, and the joy of climbing (tall) ladders to unwire and rewire ceiling fans in four different rooms.
Thinking through possible problems and building software interfaces that take those situations into account … might be a bit more important than we think they are if software is really going to eat the world.
Second, the retailer’s web site—a large retailer with thousands of physical stores across the United States. Twice I’ve ordered from this site, asking to have the item held in the local store so I can pick it up. The site won’t let you order the item for store pickup unless they have it in stock.
The first time they called me to say they couldn’t find the item I ordered, but they found a “newer model” that was a lot less expensive. It was a lot less expensive because it wasn’t the same item. They never did find the item I originally ordered.
The second time they called me to say they couldn’t find the item I ordered. I asked if they could just ship the item to my house when it’s back in stock. “I’m sorry, our system doesn’t allow us to do that …” Several hours later, they called back to tell me they found it, but they cannot reinstate my order—I must place a new order.
Again, software quality strikes … what should be a simple process just isn’t. There will always be mismatches between the state in software and the state in the real world—but design the system so it’s possible to adapt when this happens, rather than shutting down the process and starting over.
Third, I own a car that has all the “bells and whistles,” including an adaptive cruise control system. There are certain situations, however, where this adaptive control does the wrong thing, producing potentially dangerous results. There is no way to set the car to use the non-adaptive cruise control permanently (I called and waited on the phone for several hours to discover this). You can select the non-adaptive cruise control on a per-use basis by going through a set of menus to change the settings … while driving.
Software quality anyone?
Software eats the world might be someone’s ultimate dream—but I suspect that software quality will always be the fly in the ointment. People are not perfect (even in crowds); software is created by people; hence software will always suffer from quality problems.
Maybe a little humility about our ability to make things as complex as we might like because “we can always have software do that bit” would be a good thing—even in the networking world.
Off-topic post for today …
In the battle between marketing and security, marketing always wins. This topic came to mind after reading an article on using email aliases to control your email—
For example, if you sign up for a lot of email newsletters, consider doing so with an alias. That way, you can quickly filter the incoming messages sent to that alias—these are probably low-priority, so you can have your provider automatically apply specific labels, mark them as read, or delete them immediately.
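The sorting scheme the article describes can be sketched in a few lines. This assumes “plus”-style aliases (user+tag@example.com), which many providers support; the specific tags and label names below are my own invention for illustration:

```python
# A minimal sketch of alias-based mail sorting. The tag-to-label
# mapping is hypothetical -- in practice your mail provider's filter
# rules would do this work.
import email.utils

ALIAS_LABELS = {
    "news": "newsletters",   # low-priority, read later
    "shop": "receipts",
    "bank": "financial",     # the address only institutions get
}

def label_for(to_header: str) -> str:
    """Return a label based on the plus-alias in a To: header."""
    _display, addr = email.utils.parseaddr(to_header)
    local_part, _, _domain = addr.partition("@")
    _base, _, alias = local_part.partition("+")
    # unknown or missing aliases fall through to the normal inbox
    return ALIAS_LABELS.get(alias, "inbox")

print(label_for("Jane <user+news@example.com>"))  # newsletters
print(label_for("user@example.com"))              # inbox
```

The same matching logic is what a server-side filter rule (Sieve, Gmail filters, etc.) applies when it routes a plus-addressed message to a folder.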
One of the most basic things you can do to increase your security against phishing attacks is to have two email addresses, one you give to financial institutions and another one you give to “everyone else.” It would be nice to have a third for newsletters and marketing, but this won’t work in the real world. Why?
Because it’s very rare to find a company that will keep two email addresses on file for you, one for “business” and another for “marketing.” To give specific examples—my mortgage company sends me both marketing messages in the form of a “newsletter” as well as information about mortgage activity. They only keep one email address on file, though, so they both go to a single email address.
A second example—even worse in my opinion—is PayPal. Whenever you buy something using PayPal, the vendor gets the email address associated with the account. That’s fine—they need to send me updates on the progress of the item I ordered, etc. But they also use this email address to send me newsletters … and PayPal sends any information about account activity to the same email address.
Because of the way these things are structured, I cannot separate information about my account from newsletters, phishing attacks, etc. Since modern phishing campaigns use AI to create the most realistic emails possible, and most folks can’t spot a phish anyway, you’d think banks and financial companies would want to give their users the largest selection of tools to fight against scams.
But they don’t. Why?
Because—if your financial information is mingled with a marketing newsletter, you’ll open the email to see what’s inside … you’ll pay attention. Why spend money helping your users ignore your marketing materials by separating them from “the important stuff”?
When it comes to marketing versus security, marketing always wins. Somehow, we in IT need to do better than this.
Worth Listening: Heidi Roizen
Project AI+Compassion just interviewed Heidi Roizen about compassion in IT; it’s worth listening to. From the show notes—
Storytelling is a powerful way for humans to connect and for humans to move other humans to action. To understand your story, and to understand the stories of others, we can develop a lot more compassion for people when we understand that everybody has a story. Everybody’s story is important. So, don’t dismiss other people’s stories. Take the time to learn them; everyone has a story.
How the Internet Really Works, Part 1
I’m a bit late posting this … but this Thursday (an odd day for me) I’m running How the Internet Really Works, Part 1, over at Safari Books Online. From the page:
This live training will provide an overview of the systems, providers, and standards bodies important to the operation of the global Internet, including the Domain Name System (DNS), the routing and transport systems, standards bodies, and registrars. For DNS, the process of a query will be considered in some detail, along with who pays for each server used in the resolution process, and the tools engineers can use to interact with DNS. For routing and transport, the role of each kind of provider will be considered, along with how they make money to cover their costs, and how engineers can interact with the global routing table (the Default Free Zone, or DFZ). Finally, registrars and standards bodies will be considered, including their organizational structure, how they generate revenue, and how to find their standards.
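Since the training walks through the resolution process in detail, here is a toy sketch of the basic idea: a resolver starts at a root server and follows referrals down the delegation chain until it reaches an authoritative answer. The server names and zone data below are invented for illustration, not real infrastructure:

```python
# Toy model of iterative DNS resolution. Each "server" either answers
# for a name or refers the resolver to a server closer to the answer.
ZONES = {
    "root":        {"com.": "tld-server"},              # referral to .com
    "tld-server":  {"example.com.": "auth-server"},     # referral to the zone's NS
    "auth-server": {"www.example.com.": "192.0.2.10"},  # authoritative answer
}

def resolve(name: str) -> str:
    """Walk the delegation chain from the root until an address is found."""
    server = "root"
    while True:
        zone = ZONES[server]
        if name in zone:
            nxt = zone[name]
            # an IP address is a final answer; anything else is a referral
            if nxt[0].isdigit():
                return nxt
            server = nxt
        else:
            # follow the referral for the closest enclosing zone
            for suffix, nxt in zone.items():
                if name.endswith(suffix):
                    server = nxt
                    break
            else:
                raise KeyError(name)

print(resolve("www.example.com."))  # 192.0.2.10
```

A real resolver adds caching, timeouts, and multiple servers per zone, but the walk from root, to TLD, to authoritative server is the same shape—and each of those servers is run (and paid for) by a different party, which is part of what the course covers.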
You can register for the training at the link above. I’ll be giving part 2 of How the Internet Really Works next month.