April 26, 2024

Software Defined <insert term here> & how to make Mogy proud

I have had a pretty diverse range of responsibilities in my career. I have worked in fast food restaurants, pizza joints, department stores, an avionics shop, an Internet provider, a comm center, a state-funded academic program, several insurance companies, a couple of financial companies, and most recently EMC.

Despite the wide variety of places I have worked since I became of working age, the overall goals (from an abstracted and simplistic view) are pretty much the same: provide a product or service that delivers value to the customer, while doing so in a way that achieves an expected business result. Again, this is a simplistic view, but a fairly accurate one.

Processes that worked
When I worked at my first fast food restaurant, I didn’t really think about process-type things like store layout, or the workflow of burger patties getting moved from the freezer to the prep area, then to the grill, to the assembly station, into the warmer bin, and so on. To be honest, I was just happy to have a job, making my whopping $3.35 an hour. Looking back, I think about the process of getting burger patties from the freezer to the customer. There has to be some intelligence there. This chain had been around for many years, and I’m pretty sure that, after some trial and error, they came up with a pretty streamlined process of meal preparation and sale. I can visit any restaurant of the same franchise today, and the process hasn’t changed much. The store layout may be a little different, but the core workflow of the operation still works. I know that from time to time (like lunch hour) things can get busy, and some hustling is needed to keep up with demand.

When I worked in department store positions, whether in electronics or in clothing, we also had workflows. In the Electronics Department, we demonstrated products, sold products, ordered replacement stock, and so on. Working in clothing, we showed styles of suits, provided fittings, marked clothing, sent it to alterations, and contacted the customer when the clothes were ready. Both of these positions had an expected number of customers, depending on the time of day, the time of month, and especially the time of year. With the Christmas season traditionally being the busiest, we always hired temporary workers to help serve customers in an appropriate amount of time. Thinking back, we really scaled up each season to ensure we maintained an acceptable service level for our customers. We scaled up, and scaled down, on an orchestrated schedule, based on dynamic parameters each season.

Processes that didn’t work
On the customer side, I’ve held various technology roles, from Technical Specialist to IT Architect. Many of the tasks/projects/responsibilities I have had over the years carried subtle intricacies that were there “because that’s the way we do it” or because “it has just always been done that way.” Some of those were recognized as inefficient and either replaced, phased out, or simply left in place because of a business need.

As an IT Architect, I ran across a particular situation where we had hundreds of computers that needed some settings changed every six months or so. Most of these changes were made interactively and required someone to make the change by hand. Given the number of systems, it took about 50 people, each spending a couple of hours a day for the better part of a month, changing over as many systems each day as they could. Changing the settings on a single computer, including validation and error checking, took anywhere from 10 to 30 minutes. It was very inefficient, to say the least, as well as prone to human error.

I looked at the issue and thought, “I ought to be able to script this.” Without my boss’s approval, I wrote a Visual Basic script that would automate the process for a single computer and provide validation/checking for the changes. The script took approximately 10 seconds to run, regardless of the complexity of the individual computer’s configuration. As a result, a 50 people x 2 hours a day x 20 working days effort (2,000 hours) was transformed into a 5 people x 4 hours x 2 working days process (40 hours). That’s a significant difference. Writing the script had taken 8 hours of my time. I hadn’t eliminated the need for the change itself, but I had standardized the change process, provided reporting, and removed the possibility of human error (other than my own coding) in making the updates. Because this was not a “sanctioned” script, I didn’t get the opportunity to expand on it from an enterprise perspective. Despite that fact, it became the de facto standard for this particular change process.
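The logic itself was simple: apply the change, read it back to validate, and record the result. Here is a minimal PowerShell sketch of that idea (the original was a Visual Basic script, and the registry path, value name, and target value below are hypothetical placeholders):

```powershell
# A minimal PowerShell sketch of the idea (the original was a Visual Basic script).
# The registry path, value name, and target value are hypothetical placeholders.
$computers   = Get-Content .\computers.txt            # one hostname per line
$regPath     = 'HKLM:\SOFTWARE\Contoso\AppSettings'   # hypothetical key
$valueName   = 'RefreshInterval'                      # hypothetical value name
$targetValue = 30

$report = foreach ($computer in $computers) {
    Invoke-Command -ComputerName $computer -ScriptBlock {
        param($path, $name, $value)
        Set-ItemProperty -Path $path -Name $name -Value $value
        # Validate: read the value back and report success or failure
        $actual = (Get-ItemProperty -Path $path -Name $name).$name
        [pscustomobject]@{
            Computer = $env:COMPUTERNAME
            Setting  = $name
            Applied  = $value
            Verified = ($actual -eq $value)
        }
    } -ArgumentList $regPath, $valueName, $targetValue
}

# One CSV covers both the reporting and the validation audit trail
$report | Export-Csv .\change-report.csv -NoTypeInformation
```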

Could it have really been great? Absolutely, but I didn’t have management buy-in to take it further.

In another role, I had to generate some reports several times a week in an Excel workbook. Pull this, copy that, paste this, and so on. The whole process took about 3 hours. Why? Because there was no convergence of the data and no easy way to put it all together. After a few weeks, I got a little frustrated and thought to myself, “I ought to be able to script this.” Some data was in Oracle, some in MS SQL, and some in DB2. Yuck. A little discovery time with the different database owners, and a little after-hours (translated: unsanctioned) development, and I had an Excel macro. Click execute, and 30 seconds later I had my results. I forgot to mention, there were 3 other folks creating this report (also at 3 hours a run, 3 times a week). That’s 4 people x 9 hours a week, or 36 hours a week. By spending a couple of hours after work, I had changed a 36-hour process into a 6-minute process (4 people x 30 seconds x 3 times a week). Needless to say, my co-workers were thrilled when I shared my macro with them, and they could free up time to work on other important tasks.
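The macro was really just glue: pull a result set from each database, line the rows up, and drop the combined result where Excel could use it. A rough PowerShell equivalent of that glue might look like the following; the DSN names, queries, and column names are hypothetical, and the original lived inside an Excel/VBA macro rather than a script:

```powershell
# A rough PowerShell equivalent of the macro's "glue" (the original was Excel/VBA).
# DSN names, queries, and column names are hypothetical placeholders.
function Get-OdbcRows {
    param([string]$Dsn, [string]$Query)
    $conn = New-Object System.Data.Odbc.OdbcConnection("DSN=$Dsn")
    $conn.Open()
    try {
        $adapter = New-Object System.Data.Odbc.OdbcDataAdapter($Query, $conn)
        $table   = New-Object System.Data.DataTable
        [void]$adapter.Fill($table)
        $table.Rows          # emit the rows
    }
    finally { $conn.Close() }
}

# Pull from each source system...
$oracleRows = Get-OdbcRows -Dsn 'OracleDSN' -Query 'SELECT id, amount FROM sales'
$mssqlRows  = Get-OdbcRows -Dsn 'MssqlDSN'  -Query 'SELECT id, region FROM customers'
$db2Rows    = Get-OdbcRows -Dsn 'Db2DSN'    -Query 'SELECT id, status FROM orders'

# ...then merge on a shared id and drop the result where Excel can pick it up.
$merged = foreach ($row in $oracleRows) {
    [pscustomobject]@{
        Id     = $row.id
        Amount = $row.amount
        Region = ($mssqlRows | Where-Object id -eq $row.id).region
        Status = ($db2Rows   | Where-Object id -eq $row.id).status
    }
}
$merged | Export-Csv .\weekly-report.csv -NoTypeInformation
```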

When my boss and director learned of the macro, I was told simply, “It is not your job to code.” I was let off with a warning. Although I had made a process more efficient, “coding” wasn’t a responsibility of my role, and I was told to cease and desist from any additional “optimization development.”

Having a plan
In the situations above, I knew that we were working inefficiently. How was I going to fix the issue in each case without wasting my time, or the company’s time?

I think back to the movie “The Hunt for Red October” when Fred Thompson’s character tells Jack Ryan (Alec Baldwin) that “Russians don’t take a dump, son, without a plan.”

That statement reflects his character’s military experience: he knows that every situation needs to be analyzed before it is acted upon, rather than reacting only to the data at hand. He knows that winning a battle and winning the war are two completely different things.

Implied by Fred Thompson’s statement is the need to take all of the available data into account for a given situation and determine how it can be used to choose the best course of action.

Using the example of my Windows Visual Basic script mentioned earlier, data collection included things like what changes occurred, what the manual change process was, how often it had to occur, and what the validation process was.

For many virtualization admins, the process of justifying a virtualization-first policy really fits into this mold. How long does it take to deploy a 1U pizza box? Order it, receive it, rack it, update the BIOS, install the OS, patch it, install the application or service on it, test it, put it into production, and so on. Need another? Rinse and repeat. Before we had a virtualization-first policy, I detailed how many physical systems we had and how long it took to put them in place. Because I knew the policy was going to really take off, I put other things into place to ease my deployments: standardized templates, an updated template patching policy, some PowerCLI scripts to quickly deploy vApps, and the like. I had won a battle and had a plan to win future battles. But I was still lacking something…
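The PowerCLI side of that preparation doesn’t need to be fancy. A minimal sketch of a template-based deployment, where every name (vCenter, template, host, datastore, VM) is a hypothetical placeholder, might look like this:

```powershell
# A minimal PowerCLI sketch of template-based deployment.
# Every name here (vCenter, template, host, datastore, VM) is a hypothetical placeholder.
Connect-VIServer -Server vcenter.example.com

$template  = Get-Template  -Name 'Win2012R2-Patched'
$vmHost    = Get-VMHost    -Name 'esx01.example.com'
$datastore = Get-Datastore -Name 'DevDatastore01'

# Deploy a new dev VM from the standardized, pre-patched template and power it on
New-VM -Name 'dev-app-01' -Template $template -VMHost $vmHost -Datastore $datastore |
    Start-VM
```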

When developers asked for a new development environment, I still had to spin one up. I had to make sure they had a completed system request case, make sure they had a manager’s approval, blah, blah, blah. “Where is my dev environment?” was not an uncommon email/phone call/voicemail for me. Then the back and forth began… Get this approval, we’ll prioritize it, and we’ll get to it sometime in 2016 when we have the time. That wasn’t the norm, but it seemed like it. More often than not, it turned out to be a late-night deploy from jobsite #2 (translated: home office). I kept thinking to myself… why am I killing myself doing all of this repetitive work?

Getting smart about the process
During the 1930s, Allan H. Mogensen (Mogy) held many conferences and workshops around “work simplification.” He also played a significant part in the standardization of flow chart and process chart symbols. I’m not going to go deep into his teachings/views, but suffice it to say that, to some degree, he is a founding father when it comes to taking business processes and streamlining them for efficiency while respecting a worker’s input to the process. He believed that the person actually doing a job probably knows more about that job than anyone else and is therefore the one person best suited to improve it. He coined the phrase “work smarter… not harder.”

As an engineer, I would write scripts all day long to make sure that I didn’t have to do something more than three or four times… But those scripts were run manually and had a tendency to be understood (translated: deciphered) only by me. I was working a little smarter.

A New Enabler – SDx
Now Software Defined <insert term here> is all the rage. GOOD! Everyone is adding the ability to automate through APIs, REST interfaces, SDKs, and the like. Believe it or not, some of this has been around for a while, albeit called something else: integration. A good example is the EMC Virtual Storage Integrator, which provides visibility and control from vSphere into EMC arrays, as well as Unisphere’s integration into vCenter. Integration is great, and can provide some wonderful capabilities, but it is still management of a particular entity.
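To make that concrete, driving an SDx layer usually boils down to a handful of authenticated REST calls. The snippet below is only an illustration, with a made-up endpoint and payload rather than any particular product’s API:

```powershell
# Illustrative only: a generic, hypothetical SDx endpoint and payload.
$body = @{ name = 'dev-volume-01'; sizeGB = 100 } | ConvertTo-Json

Invoke-RestMethod -Uri 'https://sdx.example.com/api/v1/volumes' `
    -Method Post `
    -Headers @{ Authorization = 'Bearer <token>' } `
    -ContentType 'application/json' `
    -Body $body
```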

I always explain server/desktop virtualization to non-techy people as “one computer acting as many computers.” VMware vCloud Director, with its ability to create Organization VDCs that are given resources from a Provider VDC, could be described (using the same analogy) as making one datacenter look like many datacenters through some degree of abstraction. Items like the application (in a vApp), an x86 workload (in a VM), a network, and a firewall can all be abstracted from their physical resources.

With workloads able to run on abstracted resources, even more things become possible. Yes, all of this can be driven manually, but why? The next steps are workflow-orchestrated processes and intelligent monitoring that can scale an application based on its workload. Software Defined <insert term here> is the enabler that makes this possible.
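As a rough illustration of that last step, here is a sketch that pairs simple monitoring with a scale-out action in PowerCLI. The VM names, threshold, and template are hypothetical, and a real environment would hand this decision to an orchestration engine rather than a loose script:

```powershell
# A rough sketch only: watch average CPU across the app's VMs and clone one more
# node from a template when the app runs hot. All names and the threshold are hypothetical.
Connect-VIServer -Server vcenter.example.com

$appVMs = Get-VM -Name 'web-*'
$avgCpu = (Get-Stat -Entity $appVMs -Stat cpu.usage.average -Realtime -MaxSamples 5 |
           Measure-Object -Property Value -Average).Average

if ($avgCpu -gt 75) {
    # Scale out: deploy the next web node from the standard template
    $nextName = 'web-{0:d2}' -f ($appVMs.Count + 1)
    New-VM -Name $nextName `
           -Template (Get-Template -Name 'WebNode') `
           -VMHost (Get-VMHost -Name 'esx01.example.com') |
        Start-VM
}
```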

Moral to the story
As a food service worker and a retail salesperson, we had processes/protocols/whatever you would like to call them that worked. Period. Most of those processes are pretty much the same today. Yes, some things have changed in how we pay for, order, and consume products and services, but for the most part, in a simplistic form, they haven’t changed. Compare how any two fast food chains operate, and I’m pretty sure they are going to operate in a fairly consistent manner. The same goes for department stores. I’m speculating here, but I would imagine that where processes didn’t work or flow well, subtle changes were made to make things operate more efficiently.

In a couple of other roles, I attempted to make some things easier for myself and some colleagues, and it worked, but only on a small scale. The changes were not sanctioned by management and, in the grand scheme of things, didn’t make a significant impact on the business. I made some teammates happy, but nothing major from a business perspective.

True business process transformation requires management buy-in/ownership, leveraging the knowledge and experience of people who know the enterprise’s processes most intricately.

Now, imagine if my management had provided support behind my projects… With real tools…

What could we have accomplished?

I’m pretty certain Mogy would be proud of us.
