Tuesday, February 13, 2007

SOA and EA Convergence?

I happened to read these articles about SOA and EA Convergence ....

http://weblog.infoworld.com/realworldsoa/archives/2007/02/soaea_convergen.html
http://weblog.infoworld.com/realworldsoa/archives/2007/02/open_group_deba.html

Hmm ... I think the author is confused. In my mind, EA is a process, whereas SOA is an approach.

When one looks at frameworks like TOGAF and FEAF, it's clearly about the architecture process. These frameworks also encapsulate a detailed methodology of their own. Net-net, it's about steps and best practices in understanding the as-is architecture and bringing in different drivers to define the to-be architecture. The EA approach/process doesn't care about specific pieces of technology or how the business processes are defined. It's just about providing principles for the creation of a sound architecture foundation satisfying the drivers of today and tomorrow.

On the other hand, SOA is an architectural approach - it does care about the way processes are defined to enable service orientation. It does care about specific pieces of technology and the resulting interoperability between various components. It does expect principles such as reusability to be put into action from top to bottom.

So, one should be able to apply the EA principles/process to the SOA approach ... isn't it? In the process of applying EA principles to the SOA approach, some specific steps may have to be redefined - such as QoS/SLA definitions, since SOA promotes inter-organization interoperability.

Any thoughts folks?

Individual Wealth Management Solutions - Security Implications ... some thoughts

Private wealth management is becoming one of the hottest areas in the financial services sector. With internet technology enabling the expansion of the financial services sector, it's not just ultra-wealthy investors (with tens of millions of dollars to invest), but also small-time investors (with a few hundred thousand), looking to invest in a variety of financial instruments.

With the understanding that the value chain in the wealth management industry will expand and move toward global distribution, and that several parties in the value chain will access information through several channels, security around that information becomes the topmost concern. The need for tightening information security is further heightened as intrusion into transported data is on the rise, resulting in increased identity-theft activity.

Following are some of the potential implications around information security as applicable to a distributed wealth management environment:

Distributed Identity management:
Authentication, authorization and managing the accessing party's identity will be extremely difficult. The core portfolio and financial product information is assumed to be maintained with the sponsor. However, the participants in the distributed value chain will form a matrix-like virtual group. For example, an advisor could be an independent entity, could belong to a small organization of advisors, or could even belong to a sponsor. The participating investor could be an independent consumer or an institutional investor. The relationship between participating actors is a complex one, and their core identity information is likely to be distributed throughout the value chain rather than present in a single location. In this situation, authentication and authorization of a participating actor become extremely complex. Federated identity management is a possible way to address this matrix-like participant environment (or exchange type of environment). But there are no clear standards or guidelines that address the issues and implications of this wide variety of consumer-participating-exchange environments. In addition, emerging federal standards such as multi-factor authentication make it more complex to evolve further on the federated identity management concept.

Multi channel information access:
With the proliferation of internet protocol usage, information access is being facilitated through several channels – thin browser clients, voice-activated access clients and mobile devices such as cell phones and PDAs. Combined with many-to-many participant relationships, this ability to access information through multiple channels poses serious security implications. While security aspects around information access through a browser are stabilizing, the maturing voice and wireless technologies with changing standards make it difficult to adopt a specific implementation methodology/standard.

Protection during data/information transportation:
Again, as an effect of the expanded nature of distributed participation, the number of hops the data/information has to make through the value chain increases. The data/information is moved between private and public transport mechanisms at several points within the value chain due to the compound nature of transactions. Though technologies like cryptography and PKI are available to insulate the transported data, the need for implementing such technologies in a multi-hop distributed environment brings in complexities around governance and sustenance.

Disparate technologies among collaborating entities:
Another effect of multiple collaborating entities in the value chain is the presence of disparate technologies. While the back-end technology with most sponsors is typically a mainframe environment, applications on open standards such as J2EE and .NET, along with message-oriented middleware (MOM), are widely used in financial transaction applications worldwide. SOA and WebServices technologies are expected to provide interoperability for seamless transaction processing in the wealth management solution. However, security approaches in the WebServices area are still in the early stages of being tried out. Full encryption of transactions makes them heavy, and the marshalling/unmarshalling events as transactions get processed within services impact performance levels, thus impacting the SLA/QoS. Selective encryption guidelines through standards like WS-* are being defined – however, adoption of such standards will take some time.

Data Security - California SB1386 – Risk/Mitigation analysis


As a Technology Officer, I consult with aerospace and automotive clients. I came across this security regulation publication - California SB1386. I foresee the following as the potential areas and effects one will encounter - particularly in an offshore/onshore consulting environment. In the following sections I analyze the issues, present my views, and briefly cover the risks and mitigations.


Design of new systems that’ll interact with HR systems containing personal information of employees
Risk: There are chances that personal data is captured in the new local system, either in persistent or cached mode. If the data is not encrypted, it may potentially be viewed and misused by the system designer/developer.
Mitigation: Analyze the need for transfer of personal data in the design stage and eliminate packets of data that contain personal data. If access to personal data is essential in the new system, design with encryption mechanisms in both cached and persistent modes for the entire lifecycle of the data.
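To make that mitigation concrete, here's a minimal Java sketch (hypothetical class and method names, not from SB1386 or any client system) of encrypting personal data before it is cached or persisted. A real system would source the key from a managed key store and use a vetted cipher mode rather than the JCE defaults assumed here.

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;

// Hypothetical helper: personal data is encrypted before it is cached or
// persisted locally, and decrypted only at the point of legitimate use.
public class PersonalDataVault {
    private final SecretKey key;

    public PersonalDataVault() throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(128);
        this.key = kg.generateKey(); // in practice, obtained from a key store
    }

    public byte[] encrypt(String plainText) throws Exception {
        Cipher c = Cipher.getInstance("AES");
        c.init(Cipher.ENCRYPT_MODE, key);
        return c.doFinal(plainText.getBytes("UTF-8"));
    }

    public String decrypt(byte[] cipherText) throws Exception {
        Cipher c = Cipher.getInstance("AES");
        c.init(Cipher.DECRYPT_MODE, key);
        return new String(c.doFinal(cipherText), "UTF-8");
    }
}
```

The point is the placement of the encryption step: the cached/persisted copy only ever sees the ciphertext.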

Involvement of human resources that interact with systems that contain personal data (employees or customers)

Risk: Deliberate or inadvertent misuse of personal data – for example, leaving hand written notes behind after testing the system or copying data sets on to a personal laptop/desktop that doesn’t belong to the customer system environment.
Mitigation: Have the developers and designers involved undergo the necessary training about personal data regulation, and bring them under an appropriate governance structure, such as signing a contract on personal data usage. Also, ensure the necessary notification mechanisms are developed and tested as part of the enterprise governance structure with regard to notifying the affected parties when a potential misuse or loss of personal data occurs.

Design of systems that interact with third-party partner systems that transmit personal data

Risk: Potential transfer of the company's data set containing personal data, and vice versa. As in Issue 1, the data may be stored persistently or cached on a local system.
Mitigation: As in Issue 1, analyze and eliminate the need for personal data transfers, or implement appropriate encryption mechanisms. The mitigation factors described in Issue 2 are also applicable here.

Human resources involved in transporting personal data on mobile devices such as laptops and PDAs

Risk: Potential for loss of mobile devices containing personal data – for example, laptop thefts, which are increasingly common these days.
Mitigation: Define and implement elements pertaining to the transport of personal data on mobile devices into the data security governance structure. For example, define the levels of employees eligible to transfer and carry personal data on mobile devices; provide education on the risks of carrying personal data and the mitigation factors (for example, direct employees that mobile devices should be within their personal reach at all times when carried outside the work location, and that laptop bags should not be checked in while flying). Define and enforce a strong encryption process when data is transferred from the company's systems to a laptop – ensure that laptops don't transfer any external system data without encryption. Define and implement dataset self-destruction mechanisms – for example, if the data is not accessed for a stipulated time, the dataset is automatically deleted and/or becomes unusable.
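The self-destruction idea can be sketched roughly as follows (hypothetical names; a production mechanism would also track last access rather than only last modification, and would securely shred rather than merely delete the file):

```java
import java.io.File;
import java.util.concurrent.TimeUnit;

// Hypothetical sketch of a dataset "self-destruction" check: if a local
// copy of personal data has not been touched within the stipulated window,
// it is deleted so a lost or stolen laptop carries less exposure.
public class DatasetReaper {
    private final long maxIdleMillis;

    public DatasetReaper(long maxIdleDays) {
        this.maxIdleMillis = TimeUnit.DAYS.toMillis(maxIdleDays);
    }

    /** Returns true if the file was stale and has been deleted. */
    public boolean reapIfStale(File dataset, long nowMillis) {
        long idle = nowMillis - dataset.lastModified();
        if (idle > maxIdleMillis) {
            return dataset.delete();
        }
        return false;
    }
}
```

A scheduled job on the laptop would run this check over every directory where encrypted personal datasets are allowed to live.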

Consultants giving out personal data to the customer company – for example, providing name, DOB and SSN to obtain badges from the customer company during an onsite visit
Risk: Misuse of personal data that is transmitted over fax or in paper documents sent by mail/courier. Consultants visiting onsite from India may not be aware of identity-theft aspects in the US, as such concepts are not yet prevalent in emerging countries like India. Such consultants may inadvertently leave paper trails in public places, opening the way for identity theft.
Mitigation: Train employees on the risks and mitigations of providing personal data to customer companies during onsite visits, as well as of being negligent about inadvertent exposure of such data in public places. Educate them on the processes for seeking help when in need. The mitigation mechanisms described in Issue 2 should also be included here (governance structure, notification mechanisms, etc.).

Offshore consultants potentially involved in system design/development with test data that contains personal information

Risk: As in Issue 2, but in this case it may have far-reaching effects – for example, an SSN and credit card numbers exposed in a different country could lead to gross misuse of the information to buy merchandise or services online in a short time, without the traceability that might be possible within US jurisdiction.
Mitigation: Implement elements in the governance structure to completely restrict transmission of personal data to offshore locations. For testing purposes, only fictitious data should be created at offshore facilities. Onsite coordinators should be trained on the governance structure, risks and mitigations to eliminate these risks.


And you thought COBOL is dead?



Think again! Contrary to a popular misconception among the current generation of programmers – deep into open-systems languages such as Java/J2EE and .NET, and only faintly familiar with C++ – COBOL is growing at a rate of 3% to 5%.

No one is learning COBOL in school anymore, and new applications aren't built on COBOL anymore – it's like Latin versus English! Yet the growth is real – how? Maintenance of existing mainframe programs. It makes sense, right? Thousands and thousands of legacy applications constantly go through enhancements, including changes in core business logic to accommodate changing business needs.

Written over the past 40 years, the COBOL applications residing on mainframes today carry a total value exceeding $1 trillion (ref: Computerworld, April 24, 2006). Some amount of PL/1 and 4GL code is in the mix, but COBOL has the lion's share.

Sure, there are efforts around the globe to modernize legacy systems from mainframes to distributed systems. But most of the effort is still in its infancy or at the early thought-process stage. You would think that today's distributed systems offer scalability and manageability – but against the five 9s of reliability and the blazing performance these mainframe systems offer for a complex transaction processing application, today's distributed systems just don't stand a chance. For example, when I worked to rearchitect and prove a Model-Type-Option computation system for a large automotive company in Japan on a Java-based custom system, the performance of the new system stood at about one-twentieth of what their mainframe system offered.

Just to give you all a flavor of how those white elephants are recognized, there are three categories of mainframe systems:
(a) Under 500 MIPS; (MIPS = Millions of Instructions Per Second)
(b) Between 500 and 1000 MIPS and
(c) Over 1000 MIPS.

Of these, enterprises that use systems of over 1000 MIPS are not even willing to touch the big irons for modernization – particularly where the computing algorithms are complex, non-linear ones. Category (b) is a grey area – CIOs/CTOs are brooding, waiting and watching. Category (a) is where the thought processes and early attempts are focused and taking some shape. Enterprises are willing to identify applications that are less critical in nature under category (a) and make an attempt to modernize them on either J2EE- or .NET-based platforms.

So, you wonder: where are the high-end big irons heading? For now, it is predicted that they are here to stay – for a decade or even two. However, some form of transformation strategy is being mulled over:
(1) Enterprises that consider the current user interface a set of clunky tab-tab-tab text screens are looking for technologies to develop web user interfaces. Screen-scraping tools – a sort of pig-with-lipstick! – make a living out of this approach.
(2) Organizations that feel their business processes and data need exposure to other systems or other parts of their business are looking at SOA as an option. But it's a long way to go.

Now you've got a glimpse of the landscape – sort of Back to the Future Part II. Here's a thought for those who want to be popular and most sought after in the coming years: if you are a J2EE/.NET/SOA junkie with experience in COBOL/CICS/mainframes, get yourself familiar with legacy transformation strategy. Who knows, you might take a "consultant" avatar!

TRUE or FALSE: SOA cannot be implemented without WebServices?



If you are ready with your answer, hold on to it ... let's validate it toward the end of this note.

Last week I met a couple of my customers' architects at one of the technology roadshows. When one of the researchers mentioned that SOA is much broader than a technology concept and that WebServices is just one of the technology enablers, a customer architect got agitated and argued vehemently against these notions. He later summed up in an email: "my only point is that if we decide that SOA is not about technology or webservices etc, then perhaps this topic should be discussed somewhere else (not between architects and researchers) .. I apologize if I wasn’t clear." This quasi-confrontation, which I think is a healthy one, triggered some thoughts in my mind, the result of which is the above quiz.

Here are my thoughts ... Before going into the details below, let's keep one thing in mind, loud and clear: SOA (Service Oriented Architecture) is about "sharing of services". It's about reusability, repeatability, and maintainability!

WebServices are NOT essential to implement SOA – if you operate in a homogeneous environment. Here's my explanation ...


Most of us techies who work with OOAD principles in Java/J2EE, C++ or .NET can clearly visualize what I am talking about. If not, here's an example: when you write code, don't you move repeated calls to a specific algorithm into a separate function or method?
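For illustration, here's a toy Java example (hypothetical names) of exactly that: a repeated computation pulled out into one method that many callers share.

```java
// Toy illustration: the repeated tax computation lives in one place
// instead of being duplicated at each call site.
public class InvoiceCalculator {
    private static final double TAX_RATE = 0.08; // assumed flat rate

    // The shared "service" within a single program: one method, many callers.
    static double withTax(double amount) {
        return amount + amount * TAX_RATE;
    }

    static double lineItemTotal(double unitPrice, int qty) {
        return withTax(unitPrice * qty);   // caller 1
    }

    static double shippingTotal(double shippingFee) {
        return withTax(shippingFee);       // caller 2
    }
}
```

Change the tax rule once, and every caller picks it up – that's the seed of the reuse argument.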


Now expand the thought further. While developing a comprehensive application, you must have used common exception handling and security services across multiple modules. Do you agree that's a shared-services approach?

Now, broaden that same thought to multiple applications in an environment that is pure J2EE. Can't one application communicate with another application to leverage some of its "services"? Let's take a specific example – consider two different applications:

Application 1 - Part Inventory System written in J2EE that has a method to query the database for available inventory.

Application 2 - a Spare Parts Management system under development in J2EE that requires inventory information. In this scenario, can't we leverage the query function along with the data available in Application 1? So, aren't we using the shared-services approach?
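A minimal sketch of that shared-services idea in plain Java (hypothetical interfaces and names – in a real J2EE deployment these would be remote interfaces or EJBs rather than in-process classes):

```java
import java.util.HashMap;
import java.util.Map;

// The shared contract: Application 1 exposes its inventory query behind an
// interface; Application 2 consumes it instead of re-implementing the query.
interface InventoryService {
    int availableQuantity(String partNumber);
}

// Application 1: Part Inventory System owns the data and the query logic.
class PartInventorySystem implements InventoryService {
    private final Map<String, Integer> stock = new HashMap<String, Integer>();

    void receive(String partNumber, int qty) {
        Integer current = stock.get(partNumber);
        stock.put(partNumber, (current == null ? 0 : current) + qty);
    }

    public int availableQuantity(String partNumber) {
        Integer qty = stock.get(partNumber);
        return qty == null ? 0 : qty;
    }
}

// Application 2: Spare Parts Management leverages the shared service rather
// than duplicating inventory queries against Application 1's database.
class SparePartsManager {
    private final InventoryService inventory;

    SparePartsManager(InventoryService inventory) {
        this.inventory = inventory;
    }

    boolean canFulfil(String partNumber, int qty) {
        return inventory.availableQuantity(partNumber) >= qty;
    }
}
```

The point is that Application 2 depends only on the service contract, not on Application 1's internals.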

In the scenarios described above, reusability, repeatability and manageability led to a shared-services approach, which I think is the foundation of the service-oriented approach. In my mind, the service-oriented thought process started and proliferated when client-server computing emerged. Later, when distributed computing gained pace, much more clarity was added to service-oriented thinking through the shared-services approach.

The key constraint here is that we operated in a homogeneous environment! When you operate within a homogeneous environment where visibility among systems is not an issue (same OS, networking infrastructure, common communication protocol, etc.), two disparate applications can "share their services".

WebServices ARE essential to implement SOA – if the environment changes to a heterogeneous one, then YES.

So, where do we use WebServices? In a heterogeneous environment that requires two different technology stacks to interoperate, or where there is no visibility among systems due to enterprise boundaries. A simple example of a heterogeneous environment: J2EE and .NET. An example of a no-visibility situation: two J2EE systems at different enterprises that are business partners and agree to share information. Here WebServices, with the concepts of service provider, service consumer and service directory, come into play.
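To illustrate just the roles – not the actual WSDL/SOAP/UDDI machinery – here's a tiny in-process Java sketch (hypothetical names) of the provider/consumer/directory triangle: a provider publishes into a directory, and a consumer finds and invokes it.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical service contract a provider would expose over the wire.
interface QuoteService {
    double quote(String symbol);
}

// In-memory stand-in for the service directory (UDDI-style lookup):
// providers publish under a name, consumers find and then invoke.
class ServiceDirectory {
    private final Map<String, QuoteService> registry =
            new HashMap<String, QuoteService>();

    void publish(String name, QuoteService provider) {
        registry.put(name, provider);
    }

    QuoteService find(String name) {
        return registry.get(name);
    }
}
```

In a real WebServices setup, the directory returns a service description (WSDL) rather than a live object, and the invoke step goes over SOAP – but the find-bind-invoke choreography is the same.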

To conclude, WebServices are NOT essential if SOA is conceived in a visible, homogeneous environment. But WebServices ARE essential across heterogeneous technology platforms or in inter-enterprise environments.

So, the answer to my question: Neither true nor false.

What do you all think?


Value of IT Architecture Discipline to Businesses


Introduction: Having spent significant time thinking about IT solutions and architectures, I thought of publishing my perspectives on IT architecture and its value to enterprises .. so here it is.

What is expected of IT?

Gone are the days when the IT function in an enterprise was viewed as a department full of software engineers who make software application systems and computers run. It's not a black-box environment anymore, where a significant amount of budget is spent to get some business systems for employees to use. These days, the IT function is expected to show strategic value and not act just as a support department. In my mind, the very reason for an IT organization's existence in an enterprise is to contribute to the shareholder value creation process – in terms of real dollars, it can be either by improving operational efficiency or by improving productivity ...

In short, Business-to-IT alignment is an expectation in enterprises today. For this alignment to happen, the IT department needs to do the following at the least:

(a) Create an efficient IT environment: The IT department has to function efficiently itself – meaning it should take a do-more-with-less, practice-before-you-preach approach.
(b) Develop systems that are agile and scalable: The IT organization should nurture a tighter working relationship with the lines of business to develop and deliver IT systems that are agile in nature.

IT Architecture’s role in creating alignment between Business and IT:

Over the years and decades, while the IT organization worked as a support organization, silos of business systems were delivered on a variety of technology platforms. Most of the business systems worked within a vertical with well-defined boundaries, thus establishing islands of automation. While some systems addressed overlapping business processes, other systems provided solutions for disconnected enterprise processes.

The fundamental value that the IT architecture discipline brings to the table is the ability to look at business systems across multiple dimensions - be it at the enterprise level or at a system level. If an analogy can be used, it's like building a set of houses in a community versus town planning for the community. Here the houses denote individual business systems and the town planning denotes the enterprise.

Here’s what Tony Scott, CTO of General Motors says about the need of architecture discipline in an enterprise: “Companies that operate without an architectural approach end up like Gulliver, tied down by tens of thousands of Lilliputian strings and wires. If he's going to move, you have to cut 10,000 strings. If the company practices enterprise architecture, you will have fewer strings to cut and more freedom of movement”.

Help Create an efficient IT Environment by reducing complexity:

At the very core of an efficient IT environment are standardization and redundancy elimination. It's the architecture discipline that brings in the concepts of shared services, reusability, common processes, and common infrastructure for individual systems to function efficiently.

By promoting practices such as shared services and common infrastructure, complexity can be largely reduced, thus improving manageability at a reduced cost of operations. For instance, consider a hypothetical scenario in which an enterprise runs HR systems on BEA Application Server with a portal as the front end, financial systems on IBM Application Server with their own portal, and manufacturing systems on Oracle Application Server and portal software. The complexity here is multifold. Just from a technology infrastructure standpoint, there are three different application server products and three different portals to maintain. This requires teams of support staff with different skill sets and potentially different sets of computing resources to avoid support issues. A prevailing architecture discipline would have consolidated on a single enterprise portal and one application server product, and brought in a shared-services approach for resources (human and computing), thus reducing the complexity and improving manageability.

Help Develop systems that are agile and scalable:

At the center of an agile and scalable system is a deep understanding of business requirements – not just the functional aspects of a system, but also quality-of-service perspectives such as integrability and interoperability. In addition, an ability to visualize how the system needs to behave and evolve in a constantly changing technology environment is required.

It's not a "you give the requirements, we'll do the coding" relationship between IT and the lines of business anymore. In other words, it's not sufficient for IT to take care of just the "technology" aspects of systems development and maintenance; it must also play a role in defining the system for cross-functional integration, thus removing barriers between business functions in an enterprise.

It's the architecture discipline and multi-dimensional governance that bring the ability to define a business system that will satisfy multiple facets. The architecture discipline instills service-oriented thinking, bringing the ability to define and develop agile systems – from both a business process standpoint and a technology standpoint.

Challenges:

Organizational-level challenge: Typically, there will be a lot of skepticism about and resistance to instilling an IT architecture discipline – a particular grumble at the enterprise level is that "enterprise architecture is a boil-the-ocean process: we're going to send people out for training, and then we're going to produce reams of paper, then contemplate, and it will be several years before there are any tangible results". Architecture is considered a waste-of-time activity even at a program level by middle-level IT managers focused on developing and delivering systems "on time".

Individual-level challenge: In my mind, an architect requires multi-dimensional skills. Any experienced architect knows that the role involves not just technical activities, but others that are more political and strategic in nature on the one hand, and more like those of a consultant on the other. A sound sense of business and technical strategy is required to envision the "right" architectural approach to the customer's problem set, given the business objectives of the architect's organization. Activities in this area include the creation of technology roadmaps, making assertions about technology directions, and determining their consequences for the technical strategy and hence the architectural approach. Good architects who possess all these skills are very hard to come by.

I'll write more about the challenges, approaches to mitigate them, tools available out there, etc., in my next article ...