June 21, 2016 brianradio2016

In the last decade, Apple truly revolutionized client computing, moving it out of the boring Wintel PC space where nothing significant had happened for years. We got the iPhone and then the iPad, and with them very different ways to compute for work and pleasure, plus a break from the whole “IT owns everything” mentality that imprisoned users. We got Siri, a big step into the “Star Trek”/“2001: A Space Odyssey” computing future many of us grew up with. We got devices that interacted together as parts of something greater (aka liquid computing).

Then those all became normal, available from Google, Microsoft, and others. Sure, we get advances every year in iOS and, to a lesser extent, MacOS — but they’re typically incremental, and often pioneered by other platforms. Apple is now just as much a follower as a leader. That was crystal clear at this year’s Worldwide Developers Conference, a fairly minor affair — just like Google I/O and Microsoft Build before it.

Basically, Apple won the revolution and we now live in that world. In the new status quo, Apple is no longer the pioneer or insurgent but simply a leader in the new establishment.

That doesn’t mean Apple is doomed, as some pundits claim. Nor does it mean that Apple now drives the direction of technology, as other pundits claim. Both are extreme, reductionist positions. But Apple is navigating in new waters where its strengths may not be enough.

June 17, 2016 brianradio2016

IDG.TV | Jun 16, 2016

Your USB drive isn’t slow because you have too much stuff on it. It’s slow because it’s formatted with a slower file system such as FAT32 or exFAT. You can reformat it to NTFS to get faster write times, but there is a catch.
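For anyone who prefers to script that change rather than click through Disk Management, here is a minimal sketch that shells out from Python to Windows’ built-in Format-Volume cmdlet. The drive letter and label are placeholder assumptions, and formatting erases everything on the stick, so treat this as an illustration of the idea, not a vetted tool.

```python
import subprocess

DRIVE_LETTER = "E"   # assumption: replace with your USB drive's letter
LABEL = "FASTUSB"    # assumption: any volume label you like

# Format-Volume ships with Windows 8 / Server 2012 and later.
# WARNING: this erases everything on the target volume.
ps_command = (
    f"Format-Volume -DriveLetter {DRIVE_LETTER} "
    f"-FileSystem NTFS -NewFileSystemLabel {LABEL} -Confirm:$false"
)

subprocess.run(["powershell", "-NoProfile", "-Command", ps_command], check=True)
```

As for the catch, one well-known trade-off of NTFS on a removable drive is portability: many non-Windows devices can’t write to NTFS volumes, and some can’t read them at all.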

June 16, 2016 brianradio2016

The growth of Apache Hadoop over the past decade has proven that this open source technology’s ability to process data at massive scale and give many users access to shared resources is not hype. The downside is that Hadoop lacks predictability: it gives enterprises no way to ensure that the most important jobs complete on time, and it does not make effective use of a cluster’s full capacity.

YARN provides the ability to preempt jobs in order to make room for other jobs that are queued up and waiting to be scheduled. Both the capacity scheduler and the fair scheduler can be statically configured to kill jobs that are taking up cluster resources otherwise needed to schedule higher-priority jobs.

These tools can be used when queues are getting backed up with jobs waiting for resources. Unfortunately, they do not resolve the real-time contention problems for jobs already in flight. YARN does not monitor the actual resource utilization of tasks when they are running, so if low-priority applications are monopolizing disk I/O or saturating another hardware resource, high-priority applications have to wait.
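The only levers available today sit outside the scheduler. As a rough illustration of that gap (not a fix for it), the sketch below polls the ResourceManager’s REST API and kills an over-consuming low-priority application outright; the host, queue name, memory threshold, and user are all assumptions, and note that the API reports allocated memory, not actual disk or network usage, which is exactly the blind spot described above.

```python
import requests  # assumes the 'requests' package; any HTTP client would do

RM = "http://rm.example.com:8088"  # assumed ResourceManager web address
LOW_PRIORITY_QUEUE = "adhoc"       # assumed name of a low-priority queue
MEMORY_LIMIT_MB = 32768            # assumed per-application allocation ceiling

# List running applications via the ResourceManager REST API.
resp = requests.get(f"{RM}/ws/v1/cluster/apps", params={"states": "RUNNING"})
apps = (resp.json().get("apps") or {}).get("app", [])

for app in apps:
    # Only allocation figures are exposed here; there is no per-task
    # disk I/O or network telemetry to act on.
    if app["queue"] == LOW_PRIORITY_QUEUE and app.get("allocatedMB", 0) > MEMORY_LIMIT_MB:
        # The bluntest possible remedy: kill the whole application.
        requests.put(
            f"{RM}/ws/v1/cluster/apps/{app['id']}/state",
            json={"state": "KILLED"},
            params={"user.name": "yarn"},  # assumes simple (non-Kerberos) auth
        )
```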

As organizations become more advanced in their Hadoop usage and begin running business-critical applications in multitenant clusters, they need to ensure that high-priority jobs do not get stomped on by low-priority jobs. This safeguard is a prerequisite for providing quality of service (QoS) for Hadoop, but has not yet been addressed by the open source project.

June 14, 2016 brianradio2016

Digital transformation, aka DX, is hot, and if you’re not doing it, your company will die and you will lose your CIO or IT leadership job. You’ll (shudder) be disrupted! Or end up on the wrong side of the Innovator’s Dilemma. That’s the message over the last few months from consultants, pundits, and of course vendors.

But wait — wasn’t digital transformation hot in the late 1990s and again in the mid-2000s? Indeed it was. It’s hot again, mainly for the wrong reasons. That is, vendors want you to buy stuff because IT has been cutting back.

There is a very good reason to invest in digital transformation. But it’s not the digital transformation you’re usually sold.

First, consider a typical definition of digital transformation. It amounts to what consultants and vendors have been saying for years: keep up with new technologies and use them; only the technologies themselves have changed.

June 7, 2016 brianradio2016

There’s a lot of sci-fi-level buzz lately about smart machines and software bots that will use big data and the Internet of things to become autonomous actors: scheduling your personal tasks, driving your car or a delivery truck, managing your finances, keeping you compliant with your medical regimen and adjusting it as needed, building and perhaps even designing cars and smartphones, and of course connecting you to the products and services they decide you should use.

That’s Silicon Valley’s path for artificial intelligence/machine learning, predictive analytics, big data, and the Internet of things. But there’s another path that gets much less attention: the real world. It too uses AI, analytics, big data, and the Internet of things (aka the industrial Internet in this context), though not in the same manner. Whether you’re looking to choose a next-frontier career path or simply understand what’s going on in technology, it’s important to note the differences.

A recent conversation with Colin Parris, the chief scientist at manufacturing giant General Electric, crystallized in my mind the different paths that the combination of machine learning, big data, and IoT is on. It’s a difference worth understanding.

The real-world path

In the real world — that is, the world of physical objects — computational advances are focused on perfecting models of those objects and the environments in which they operate. Engineers and scientists are trying to build simulacra so that they can model, test, and predict from those virtual versions what will happen in the real world.
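To make the idea concrete, here is a deliberately tiny sketch of what such a virtual model boils down to: it ingests operating data from the physical object, then gets queried for a prediction. Every name and number in it (the BladeTwin class, the 600 C reference temperature, the wear rate) is invented for illustration and bears no relation to GE’s actual models, which are full physics simulations.

```python
from dataclasses import dataclass

@dataclass
class BladeTwin:
    """Toy digital twin of a turbine blade (illustrative only)."""
    wear: float = 0.0            # 0.0 = new, 1.0 = end of life
    wear_per_hour: float = 1e-4  # assumed nominal degradation rate

    def update(self, hours: float, avg_temp_c: float) -> None:
        # Assumption: wear accelerates above a 600 C reference temperature.
        stress = max(avg_temp_c / 600.0, 1.0)
        self.wear += self.wear_per_hour * hours * stress

    def remaining_hours(self, expected_temp_c: float = 600.0) -> float:
        # Predict from the virtual model instead of waiting for the
        # physical part to degrade in service.
        stress = max(expected_temp_c / 600.0, 1.0)
        return max(1.0 - self.wear, 0.0) / (self.wear_per_hour * stress)

# Feed the twin field data, then ask it what to expect in the real world.
twin = BladeTwin()
twin.update(hours=500, avg_temp_c=650)   # hypothetical sensor summary
print(round(twin.remaining_hours(), 1))  # predicted hours of useful life left
```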

June 6, 2016 brianradio2016

If you haven’t seen the HBO show “Silicon Valley,” it’s worth the time. Along with many jabs at well-known tech companies, it’s surprisingly accurate with regard to how IT and developers think and behave. It’s not perfect, but I can’t recall ever hearing a vim-versus-emacs debate in a popular television show. (Can you?)

That same episode used an argument over spaces versus tabs in source code as a plot device. I’m sure most people watching the show weren’t entirely sure what that meant, but it was carried off well enough to make clear it was a big problem. Frankly, to many developers, it is a big problem and has been for quite some time. Truly, however, vim versus emacs is a much holier war.

Besides the obvious “neato” factor of deep techie culture surfacing in a mass-market television show, there’s a bigger point to be made. The parody serves to underscore that some problems we face have multiple right answers, and we often waste an awful lot of energy in pointless debates. Also, it’s best to have a development style guide if possible, so at least there’s a standard arbiter of such issues.

Some aspects of IT have only one right answer: the right route in a core switch, for example, or the right entry in an ACL that makes everything work. But in other places, the rules aren’t so binary. A great example is naming, of every variety: naming of servers, storage, networks, variables in source code, fields in databases, keys in arrays.

May 31, 2016 brianradio2016

Artificial intelligence — in the guises of personal assistants, bots, self-driving cars, and machine learning — is hot again, dominating Silicon Valley conversations, tech media reports, and vendor trade shows.

AI is one of those technologies whose promise is resurrected periodically but only slowly advances into the real world. I remember the dog-and-pony AI shows at IBM, MIT, Carnegie Mellon, Thinking Machines, and the like in the mid-1980s, as well as technohippie proponents like Jaron Lanier who often graced the covers of the era’s gee-whiz magazines like “Omni.”

AI is an area where much of the science is well established, but the implementation is still quite immature. It’s not that the emperor has no clothes — rather, the emperor is only now wearing underwear. There’s a lot more dressing to be done.

Thus, take all these intelligent machine/software promises with a big grain of salt. We’re decades away from a “Star Trek”-style conversational computer, much less the artificial intelligence of Steven Spielberg’s “A.I.”

May 31, 2016 brianradio2016

Much has been said in this space on the continued attacks on encryption by politicians across the globe. This demonization of the mechanism that holds the Internet together is as enduring as it is inexplicable. As I’ve said before, it’s impossible for anyone who works with network or data security to accept any argument that includes implementing a master key or backdoor in encryption standards.

In general, two sides of any particular issue will have some overlap. There may be discussion and argument on the best method to achieve a certain goal, but at least there’s agreement on the goal. In the case of encryption, however, there’s no common goal. The major issue is that technologists understand encryption is a binary concept: either an item is unbreakable, or it’s insecure. There’s no middle ground, no gray area. You either have strong, unbreakable encryption … or you don’t. An encryption standard with a built-in backdoor is breakable encryption. It’s insecure by design.
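A toy model shows why. In the sketch below (which uses the third-party cryptography package and does not correspond to any real proposal), every message key is also wrapped with a single escrow key; whoever holds that one key, whether by warrant, leak, or theft, can read every message the system has ever protected.

```python
# Toy model of "strong encryption plus a master key"; illustration only.
from cryptography.fernet import Fernet

escrow_key = Fernet.generate_key()  # the master key the authorities would hold

def encrypt_with_backdoor(plaintext: bytes):
    session_key = Fernet.generate_key()
    ciphertext = Fernet(session_key).encrypt(plaintext)
    escrow_copy = Fernet(escrow_key).encrypt(session_key)  # backdoor copy of the key
    return ciphertext, escrow_copy

ciphertext, escrow_copy = encrypt_with_backdoor(b"wire the funds on Friday")

# Anyone with the escrow key recovers the session key, and with it the message.
recovered_key = Fernet(escrow_key).decrypt(escrow_copy)
print(Fernet(recovered_key).decrypt(ciphertext))
```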

A generational gap seems to come into play here. As I discuss these topics with my father, I note that this concept is as difficult for him to grasp as the converse is to me. I’m as puzzled by his thinking that an encryption standard with a master key is acceptable as he is by my belief that encryption isn’t secure if there’s a backdoor that only authorities can use. It’s an impasse, and I think most of the reason for this stark conflict is generational in nature.

My father grew up in a physical world. In my father’s world, if the authorities needed to find evidence or information on a suspected criminal, they had to get a warrant and execute that warrant in person. They had to obtain physical evidence to prove a crime was committed. This may have included safecracking, breaking down doors, or any other breach of a secured space. As long as the warrant was granted for that space, then the search and seizure was legal and acceptable by society’s standards.