Thursday, July 02, 2015

Rethinking the "I" in IDE (Integrated Design Environment)

Most developers that I know take it for granted that the right way to develop an application is to use an integrated design environment (IDE), and in response to that expectation most development tool vendors either provide their own IDEs or provide plugins for popular IDEs.

I've had some recent experiences that are making me doubt the wisdom of the capital "I" in the IDE approach...

(If you scrutinize my LinkedIn profile, you'll likely figure out which companies I am alluding to in the following examples, but please don't. This post is meant to share insights and lessons learned, not to provide fodder for competitive bashing.)

From Thick (Eclipse) to Thin (browser) IDE 

One of the companies that I worked for a few years ago had a great integrated design environment (IDE) for creating and maintaining process applications. This environment was built on the extremely popular Eclipse IDE, and was itself extremely popular with our developers - but the world changed...

When we created our IDE, the norm was for developers to install "thick client" IDEs on their own workstations. The installation required a big download, lots of free space on the hard drive, and administrator privileges on the workstation for a successful install. Beyond that, it took several minutes for the installation process to run, but otherwise getting the IDE set up was relatively painless and error free.

This user experience (UX) for setting up the IDE was perfectly fine for the personae that we originally targeted - frequent process app developers. These folks spent a lot of time in the IDE, generally had admin privileges on their workstations, and were quite frankly brainwashed by similar download and install procedures from other development tools.

So far so good, but as our company's product gained traction, we found new personae to address: folks who used the IDE sporadically and infrequently. What we discovered was pretty obvious in hindsight: after the initial solution was created and deployed, the need to make changes to the solution was very infrequent. When changes did need to be made, they could generally be made quickly - assuming that the IDE was already installed on the developer's workstation.

For the infrequent/sporadic developer persona, the "download and install" UX for our IDE didn't pass muster. The advantages of a thick-client IDE were outweighed by the "download and install" overhead.

In response to our need to support this new persona, we in product management decided that we needed to replace our thick-client IDE with a thin-client IDE, specifically with a browser-based IDE that would only require the installation of a browser plugin. I'm sure you can anticipate our development team's response:
"Great idea, but that will take years to build."
Our IDE provided significant functionality, literally dozens of types of artifacts made up our process solutions, and for each artifact type we had a tailored editor - and many of these editors interacted with related editors. Re-implementing the entire functionality of the Eclipse-based IDE with a new browser-based IDE was a monumental task.

The solution that we arrived at was a compromise. Instead of trying to re-implement everything in a "big bang" effort, we chose to start with a hybrid approach. We built what we believed to be a minimum viable product (MVP) browser-based IDE that provided navigation links to the Eclipse-based IDE. Our hope was that the UX of this hybrid approach would be acceptable to both the frequent and sporadic developers.

On the initial release, the UX for most developers was pretty awful. The "new" thin IDE didn't have quite enough in it to spare the infrequent developers from switching over to the old "thick" IDE, and we'd put some new features into the thin IDE that the frequent developers really wanted to utilize, so they ended up juggling both environments. We were pretty quickly able to improve the UX, especially for the infrequent developer persona, but it still feels like a Frankenstein experience for many of the frequent developers.

It will take years to really provide a delightful design experience for all of the personae that we were targeting.

From Thin (Microsoft Silverlight) IDE to Thin (HTML5/JavaScript) IDE

Fast forward a few years, and I encountered another variant of "we need to re-implement our IDE". In this case, the company had already made the expensive switch from a thick Eclipse-based IDE to a thin Microsoft Silverlight-based IDE.

The Silverlight-based IDE is great, but (no surprise if you pay attention to such things) the world changed and the future of Silverlight is not bright. In response to this, we in product management once again decided that we needed to re-implement our IDE to remove the Silverlight dependency. The response from the development team?
"Great idea, but that will take years to build."

As Yogi Berra might have said: "It's déjà vu all over again".

There must be a better way

Twice burned. As Scotty would scold: "Fool me once, shame on you. Fool me twice, shame on me." 

I won't pretend to have a solution for this conundrum. It's a fact of our industry that technology, particularly user interface (UI) technology, changes frequently. The UX of our products is the single most important factor in the success of our products, so we have to incorporate the latest UI improvements or our products will slowly die. I get that.

What I don't accept is that we should continue to implement IDEs in the same old way. There are a lot of great new IDEs out there, such as Cloud9, Visual Studio Code, and Google Web Designer - but I fear they are repeating an inherent flaw in reasoning:
Tight integration (the "I" in IDE) will inevitably lead to expensive (in terms of effort and time) re-implementation.

Rethinking the "I" in IDE

So let's take a trip back in time and review why IDEs were created in the first place... IDEs were created to increase developer productivity and to improve the user experience for developers.

Prior to IDEs, the UX for developers was pretty horrendous and time consuming. Without knocking the extreme power that command-line utilities can provide, in the pre-IDE world the onus was on the developer to switch between tools to create artifacts and to keep the elements of a solution in sync with each other. Integrating all of these tools into a common framework provided a much better UX and made developers much more productive.

Designing tools so that they work well with other tools is a good thing - A very good thing. Of equal or more importance is to design tools that do one thing very, very well.
Tools that try to do too many things are often not very useful for anything.

A good starting analogy for an IDE is a tool box. You need a box in which to store your tools, and it would be great if the box also helps you organize those tools so you can get to them when you need them.
Analogies always break down at some point - but I think this is a good one. An IDE is there to help you find your tools, and those tools are designed to complement each other to do things that the individual tools cannot do themselves.

I fear that traditional IDEs have unwittingly fallen into the "Swiss Army Knife" trap:
An inherent flaw of the Swiss Army Knife is the need for each tool to conform to the constraints of the handle. This impacts the utility of each tool, but that's not really the concern I'm trying to address, so I'm going to have to pose a "not very likely" question:
What would the impact be on the blades and tools of a Swiss Army Knife if you radically changed the handle?
If you radically changed the handle of a Swiss Army Knife, you'd have to re-engineer all of the blades and tools. That's essentially what happened when we found it necessary to re-implement our IDEs. The tools within the IDE were tightly integrated with the IDE. They could not function on their own.

I think understanding this dependency is the key to finding a solution to the "we have to re-implement our IDE" conundrum. We've got to develop tools that work just fine on their own, but with interfaces that allow them to be managed by our IDE (perhaps we should use the abbreviation iDE instead).

I found a really good poster child analogy for this approach: IFTTT (If This Then That). This is a pretty nifty service that lets you take information from one service and use that information to invoke an action on another service. For example, if you receive an email (in Gmail) with specific characteristics, IFTTT will pass that information to Evernote to store it.

IFTTT is, in a very loose sense, an IDE because it combines multiple tools in an integrated platform that allows you to create a solution that the individual tools could not implement on their own. Obviously this isn't even close to the functionality that a "real" developer needs, but it's a great start.

Our iDEs need to be more like this. Each tool should stand alone (like an app) and provide APIs that allow the tool to be externally managed. The iDE provides that management.
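To make the idea concrete, here's a minimal sketch of that arrangement. All of the names here (ManagedTool, FormEditor, IDE) are hypothetical and purely illustrative - the point is that each tool works on its own, and the iDE merely registers and coordinates them through a small management interface:

```python
# A hypothetical sketch of the "iDE" idea: each tool stands alone but
# exposes a small management interface that an external manager can drive.

class ManagedTool:
    """The minimal contract a standalone tool offers to an external manager."""
    def open(self, artifact_id: str) -> None: ...
    def save(self) -> None: ...
    def close(self) -> None: ...

class FormEditor(ManagedTool):
    """A standalone editor; it doesn't know or care who manages it."""
    def __init__(self):
        self.current = None
    def open(self, artifact_id: str) -> None:
        self.current = artifact_id          # load the artifact on its own
    def save(self) -> None:
        print(f"saved {self.current}")
    def close(self) -> None:
        self.current = None

class IDE:
    """The iDE only finds and coordinates tools; it owns none of them."""
    def __init__(self):
        self.tools = {}
    def register(self, kind: str, tool: ManagedTool) -> None:
        self.tools[kind] = tool
    def edit(self, kind: str, artifact_id: str) -> ManagedTool:
        tool = self.tools[kind]             # route to the right tool
        tool.open(artifact_id)
        return tool

ide = IDE()
ide.register("form", FormEditor())
editor = ide.edit("form", "loan-application-form")
```

If the iDE's shell is later re-implemented on a new UI technology, the tools behind that contract don't have to be rewritten - only the thin coordination layer does.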

Admittedly, the UX for developers with an iDE might not be as delightful as the experience they've come to expect from their IDE. It won't be as easy to craft a delightful experience from loosely coupled tools as it was from tightly coupled ones. That said, I think it's pretty clear that we have to try, or the need to re-implement our IDEs every few years will continue to drag us down.

Thanks to all of my colleagues who've helped me come to this realization - You know who you are. And to all those who've been kind enough to read my ramblings, please let me know what you think.

Thursday, March 26, 2015

Business Process Performance and Operator Productivity

I have recently been having some very good discussions with my colleagues regarding the subtle differences between improving the performance of a Business Process and improving the performance of an Operator...

Let me introduce a scenario that will hopefully clue you in on what we've been discussing...

John has filled out a Loan Application form, and mails it to SomeInsuranceCompany. (Yes, John is old fashioned and is using paper... You'd be surprised how many important business transactions still start with paper.) 
When John's Loan Application arrives in the mail room at SomeInsuranceCompany, it is immediately scanned in, and the information that John wrote on the form is extracted from the image via a number of miraculous (and patented) marvels of technology. 
The technology for extracting actionable data from images is quite good, but there are still things that the computer can't figure out.  Fortunately, people can sometimes extract information from images that computers can't.  For example, people are generally able to decipher bad handwriting better than computers (that's the premise behind CAPTCHA too). 
Scott works for SomeInsuranceCompany in the team of people who deal with all of the forms that the aforementioned miraculous marvels of technology can't figure out. Scott's team is there to take a look at the images of the problematic Loan Applications and try to make sense of them. The alternative would be to reject these applications, and that's something that SomeInsuranceCompany wants to avoid. 
SomeInsuranceCompany wants Scott to be as productive as possible. They want to maximize the number of Loan Application images that Scott can process during his shift. This is a classic problem on how to improve operator productivity. 
Once the data has been extracted from the images, the Loan Applications are routed for review and approval through a number of activities. SomeInsuranceCompany wants to minimize the time that it takes to let John know whether or not his Loan Application has been approved. This is precisely the sort of scenario that Business Process Management deals with.

I hope that you are still with me... that was a pretty big setup, but probably necessary before I ask this question:

Would the use of BPMN help you improve Scott's productivity?

To me, this question is rhetorical... The answer is no.  BPMN is not very helpful in improving the productivity of an operator performing a single activity.  It was not designed for that use.

Scott's productivity will more likely be improved by analyzing his User Experience with the tools that he uses to review the images.  We need to track his eye movements, number of keystrokes and mouse clicks, etc.  Perhaps there are process improvements that can help reduce his fatigue over the course of the day, but it's still mostly a matter of "Eyes and Fingers" (as a colleague of mine is fond of saying).

So what's my point?

BPMN is a great tool to have in your toolkit, but it's not the best tool (or even a good tool) for all of the things you need to do to improve your business operations.  A purpose built tool for the job at hand will always serve you better, so fill your toolkit with the best tools you can find for the work that you need to do.

Wednesday, September 17, 2014

The Holy Grail of the Remodelled Business App

For many years we've been singing variations of a song about composing applications from reusable building blocks - those blocks may be services à la SOA or they might be UI widgets, but the idea is the same.  Build things that will be of value for multiple solutions - not just one solution - and you will lower your overall cost of development and maintenance and you will speed (subsequent) application development.

I don't think anyone seriously questions the value of this approach... but I question whether or not it's really the Grail that we seek as business software developers.  It's a step on the path, but not the destination.

The destination, I believe, is hinted at in the following scenario:
John needs an application to help solve a particular business problem. John's kind of a lazy person, so he really doesn't want to build an application from scratch... Instead, he looks around for an application that's pretty close to the application that he needs, and then he copies that application's source as his starting point.  He creates the app that he needs by simply modifying and replacing the portions of the "pretty close" app that don't match what he wants.
For this to work, the "pretty close" application needs to be designed to be easily modified - and the author needs tooling that is optimized for modification - rather than for "from scratch" construction.

Returning to my favorite "software developer is like a home builder" analogy, these are tools for remodelers and house flippers rather than for new home builders.

This does bring us back to those reusable building blocks.  It's likely that a composite application (one that is composed from building blocks with well thought out interfaces) will be easier to modify than a monolithic application... but only if it's really clear to the remodeler how the blocks have been wired together (so you can rewire them, replace them, etc).
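Here's a minimal sketch of what "clear wiring" might look like. The names (LoanApp, CreditCheck) are hypothetical, not from any real product - the point is that the connections between blocks live in one visible place, so a remodeler can swap one block without touching the others:

```python
# An illustrative sketch of "remodel-ability": blocks are wired together
# through explicit, visible connections, so a remodeler can replace one
# block without rebuilding the rest of the app.

class CreditCheck:
    """A building block with a well thought out interface."""
    def run(self, applicant: str) -> bool:
        return len(applicant) > 3           # stand-in business rule

class StrictCreditCheck:
    """A replacement block that honors the same interface."""
    def run(self, applicant: str) -> bool:
        return len(applicant) > 10          # a different stand-in rule

class LoanApp:
    """The wiring lives here, in one place - not buried inside the blocks."""
    def __init__(self, credit_check):
        self.credit_check = credit_check    # explicit, rewireable connection
    def apply(self, applicant: str) -> str:
        return "approved" if self.credit_check.run(applicant) else "rejected"

original = LoanApp(CreditCheck())           # the "pretty close" app
remodeled = LoanApp(StrictCreditCheck())    # swap one block; rest untouched
```

The remodeler never opens up CreditCheck; they only rewire the one connection that LoanApp exposes. That's the difference between gutting a house and swapping out a fixture.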

The original structure of the application, just like the original structure of a house, will govern how easy it is to "remodel" the structure.  To be truly successful at reusing our work in the future, we need tools and frameworks that guide us to build "remodel-ability" into our business apps from the very beginning.  That will likely cramp the style of many creative programmers, but it's what we need.

So that's the Holy Grail we seek:  A land where Remodeled Business Apps are the norm.

As always, please don't hesitate to throw rocks at my ramblings if you disagree. That's how we learn.

Friday, August 22, 2014

Recasting BPMN in its proper place

I had a great discussion this week with a professional frenemy (a BPM enthusiast who works for a competitor) on the wisdom of implementing a process engine that directly executes BPMN (Business Process Modeling Notation).  As so often happens during a great exchange of ideas, thoughts that had been rattling around my subconscious for quite some time collided and gelled into a eureka statement:
Directly executing BPMN makes no more nor less sense than directly executing <insert your favorite programming language here>.
BPMN is a graphical notation for describing the business logic of a process. It's particularly well suited for describing processes that have a sequential flow of process activities.  If you wish to automate a business process, it is a much more productive use of your time to model the process with BPMN than it would be to code your process logic in some computer assembly language.

Let's recast the previous paragraph with the Java programming language as the subject: Java is a high-level programming language that's particularly well suited for systems programming.  If you wish to create a software application, it is a much more productive use of your time to code the application in Java than it would be to code the application in some computer assembly language.

The norm in Software Engineering is to write applications in the programming language that most closely maps to the problem domain, and then compile the results to the common primitives (assembly language/machine code). This allows authors to use a wide range of programming languages - yet execute all of the resulting programs on the same "engine".

The same should apply to executing business oriented programming languages. Use the language that most closely maps to the problem, and then compile down to the common primitives. This allows authors to use a wide range of tailored business programming languages that can all execute on the same "engine".

My personal (and simplistic) concept of that business engine looks more like an Event Processing Engine than a Workflow Engine...

  1. A Business Event occurs
  2. Logic is executed
  3. Actions are initiated
Processes defined with BPMN map very nicely to this "engine":
  1. A Process Activity completes (a special type of business event)
  2. The BPMN Process Logic is executed
  3. The "Next" Activity in the Process is initiated
I'm pretty sure just about any practical Business Language that we come up with can be compiled down to these same "event processing" primitives - giving us way more flexibility than a BPMN specific engine.
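The steps above can be sketched in a few lines of code. This is a toy, not any real engine - the event names and the Engine class are illustrative - but it shows how a BPMN-style sequence flow "compiles down" to the event/logic/action primitives:

```python
# A simplistic sketch of the event-processing "engine" described above,
# with a BPMN-style sequence flow compiled down to its primitives.

class Engine:
    def __init__(self):
        self.handlers = {}                  # business event -> logic
        self.log = []                       # actions that were initiated

    def on(self, event, logic):
        """Register the logic to run when a business event occurs."""
        self.handlers[event] = logic

    def emit(self, event, payload=None):
        logic = self.handlers.get(event)
        if logic:
            action = logic(payload)         # 2. logic is executed
            if action:
                self.log.append(action)
                self.emit(action)           # 3. the action is initiated,
                                            #    which is itself a new event

# "Compiling" the flow Review -> Approve -> Notify: the completion of one
# activity is the business event that initiates the next activity.
engine = Engine()
engine.on("review.completed",  lambda p: "approve.completed")
engine.on("approve.completed", lambda p: "notify.completed")
engine.on("notify.completed",  lambda p: None)   # end of the process

engine.emit("review.completed")             # 1. a business event occurs
```

Nothing in the Engine class knows anything about BPMN; the process structure lives entirely in the compiled handlers, which is what leaves room for other business languages to target the same primitives.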

Please feel free to throw rocks at the holes in my premise... that's how we learn and grow.

Friday, March 28, 2014

The curious tale of the passionate programmer...

Sometimes I question why I am attending a specific conference... The topics that are being covered are familiar, as are many of the presenters, and the pressures and deadlines of my job seem way more pressing than anything I might learn at the conference.

I have never felt this way about bpmNext... It's somewhere between a family reunion and the BPM practitioner's version of Woodstock.  I always learn a lot, and I am never disappointed.

This year was just as much fun as last year, and as a special treat I got to meet Tom Baeyens in the flesh.

For those of you who don't know who Tom is, I like to describe him as the guy whose code probably runs more BPM solutions than any other individual.  Tom is the father of both jBPM and Activiti... by far the most pervasive open source BPM engines in the wild.  Of course it took a village to craft and polish both engines, but it was Tom's vision and passion that guided those teams.  His influence on BPM is undeniable.

Tom and I have been exchanging emails and forum threads for years, but no matter how well you think you know someone "online", you'll always be a bit surprised when you get to chat with them over a beer or a glass of wine.  Such was the case at bpmNext...

Apologies in advance if I don't recall the details accurately, but Tom told a story of the early days of jBPM that I would love to share...

As a young and passionate programmer, Tom had created a Workflow Engine that he felt had promise.  He'd written some code, and proved that it worked, but in addition to being passionate and young, he was also unfortunately poor.  Great idea, but he needed to pay the bills.  Tom knew he needed to attract the attention of a larger company to realize his dream, and JBoss was the logical choice...

What came next was one of those stories that I just love... I am sure I will muddle the details, but hopefully convey the spirit...

Tom learned of a JBoss training session that was close enough for him to attend... He had no interest in the classes, but this was his chance.  Tom coughed up the 500-euro fee, packed up his laptop, and attended the session.  Once there, he lurked in the hallway and stalked the JBoss staff until he was able to pull them aside and show them his project running on his laptop... The rest is history (aka jBPM).

I've of course romanticized the episode - I love embellishing stories - but to me this is the classic myth to which all programmers aspire.  Have a great idea, passionately execute the idea, and attract a sponsor to bring your idea to the market.  The modern version of Homer's tales.

Tom's now launched a new commercial effort called Effektif - pronounced "effective" - and I think it's going to be a winner.  Check it out, and dream about what your own passion for programming may accomplish some day.

Monday, January 06, 2014

Chromebook Thoughts

I bought my wife Teri a Chromebook for Christmas... an Acer C720p.  In honesty this was a gift for both of us... Teri is an educator who has wanted to learn more about Chromebooks and I can't resist the temptation to try out everything new.

Total cost was just under $300, bundled with a Chromecast (another new gadget to play with).

My impression was that it's quite a nice piece of hardware for $300, and it simply works.  Turn it on, log on with your Google account, and it just works.  If there's anything that can be screwed up, I haven't found it yet.

The case is gray plastic, nothing fancy but not cheap feeling... slightly larger than an iPad but still easy to tote around.

The screen's not bad either... and it's touch sensitive.  The touch screen added $50 to the cost, but I think it's well worth it.

The real test is of course in the answer to the question: "How useful is this thing?"

The answer lies in your need to use "traditional applications"... which certainly differs from person to person and business to business (or school to school).  Chromebooks are almost a no-brainer for schools that have adopted Google Docs - but almost a non-starter for those reliant on legacy applications.

Teri's very positive about her Chromebook as a "student computer" thus far, but did hit one snag already... Here in New Mexico many schools use NetLogo extensively in their curriculum, and thus far it appears that there isn't a cloud-based version available. This is likely true for many other programs teachers rely on, and a stumbling block for wider Chromebook adoption by more schools.

I've seen recent press about Chromebooks really taking off this holiday season - and counter articles deflating those claims a bit.  I have no way of knowing for sure either way - but for me the value for the price of this little Acer machine is terrific.