Guidelines to determine the right interface when
integrating with SAP systems
The purpose of this document is to help and guide projects with application
integration components to determine the right interface when communicating with
SAP systems.
The main input for determining the right interface is the business requirements. These
requirements need to be translated into integration requirements and agreed upon by
the business and the IT parties.
Depending upon the requirements on the integration flow (e.g. guaranteed delivery,
transactional messages, the reliability and performance of the interface), the right
SAP interface should be selected when the project designs the solution.
The project needs to map the requirements on the left side to the appropriate
interface on the right side, see the following figure:
Common Quality Attributes
There are several types of SAP interfaces to choose from (see the table).
We must consider at least two cases:
Case 1: If the requirements include Guaranteed Delivery or Transactional
messages, the solution has to use IDocs, ALE, WebServices with Reliable
Messaging, or some types of BAPI/RFC.
Case 2: If the requirements are less strict and the consequences of losing messages
are not that serious, it is permissible to use File/FTP integration or WebServices
without Reliable Messaging.
Transactionality: In this scope we define Transactionality as ‘the ability to ensure that a message is conveyed in its entirety’. The whole
message or nothing at all is sent.
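The mapping above can be sketched as a small decision helper (an illustrative sketch only; the flag names and interface labels are assumptions for the example, not an official SAP rule set):

```python
def candidate_interfaces(guaranteed_delivery: bool, transactional: bool) -> list:
    """Map integration requirements to candidate SAP interfaces (Case 1 vs Case 2)."""
    if guaranteed_delivery or transactional:
        # Case 1: strict requirements call for reliable, transactional interfaces
        return ["IDocs/ALE", "WebServices with Reliable Messaging", "BAPI/RFC (tRFC/qRFC)"]
    # Case 2: losing a message is tolerable, so simpler transports are permissible
    return ["File/FTP", "WebServices without Reliable Messaging"]
```

In a real project the inputs would come from the agreed integration requirements, not hard-coded flags.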
2.1 Background and problem
• Applications in today’s system landscape are heavily connected to each
other, with processes crossing company boundaries (e.g. switching electricity
providers, Meter Reading and Billing, Supply Chain Management processes
etc.). Information needs to be exchanged between the corresponding systems
that run and support these processes.
• A system (e.g. SAP ERP) is the master source of many information objects
(e.g. Products, Product Prices, Invoices, Payments etc.). Other processes and
internal/external systems depend on the information (data) that is
stored and owned by the SAP ERP system. In order to feed these systems
(e.g. Business Intelligence systems, CRM, SRM, other external and internal
systems) with the required data, integration flows between these
applications or systems have to be developed.
• A system like SAP ERP needs to receive the right information from other
systems (e.g. Meter Reading, CRM, SRM, …) in order to run its processes (e.g.
billing the customers) correctly.
• As we see, interoperability is a very important capability, and it is required by
many business processes and systems.
• Projects have to deliver their outcomes on time, on budget and with
the agreed quality. Unfortunately, many projects discover their integration
needs only in a late phase of the project. Often these projects
try to solve the integration needs without analyzing the business
requirements for these integration flows. The main concern of a project
manager is to avoid delays in the project. The consequences of a quick
design of these integration flows can be rework of the flows or more
serious consequences (e.g. lost profit, lost reputation) caused by losing
messages or sending messages more than once.
• Often projects do not consider other parameters that characterize a future-proof integration solution:
o Maintainability, simplicity of management (e.g. debugging,
monitoring, exception handling etc)
o Configurability (using Scheduling mechanism)
o Adaptability to future needs
o Security; transport with secure protocols
o Savings in development and maintenance
o Compliance with the SAP standard solution (in some cases
applications have an SAP-certified interface).
• It is not an easy task to determine the right interface for integration
between two or more systems, especially when the systems offer many
alternative interfaces. When we, as integration developers
and integration architects, started to create integration flows between our
SAP systems and the rest of the world in 2004, we made many mistakes,
went through many reworks and redesigns, and of course learned a lot of
lessons from these mistakes. I hope that this document will help you avoid
some of that rework and take the right design decisions when it comes
to integration with SAP systems.
• Many companies use SAP products (e.g. as their ERP system). In our
organization we have long experience of using SAP systems. I wrote the first
version of this document in 2007 as a one-page document and produced several
new versions in 2008, 2009 and 2010, when I was a solution
architect responsible for the EAI (Enterprise Application Integration)
area in our organization. During these years the document was used by many
integration/solution architects and developers to determine the right
interface in their projects.
• In my opinion, the main tasks of an integration architect are to:
o Design an integration solution that is in accordance with the business
requirements
o Connect the right systems, which contain the right/required
information, in the right way
There are many conditions and constraints that matter when determining the right
interface:
• The availability of an EAI tool in the organization (e.g. BizTalk Server 20XX,
SAP PI 7.X) and the existence of SAP adapters to handle IDocs, BAPIs and RFCs
• Whether the sender and the receiver system belong to different security
zones
• Whether the sender or the receiver system can communicate only via a File interface
• The SAP version(s) that are installed in the organization. This document is
based on general interfaces (e.g. BAPI/RFC, IDocs, ALE) that are provided by
many SAP versions (e.g. SAP R/3, mySAP ERP Edition 2004, SAP ERP 6.0, …).
In order to create the right content in this document, I made some assumptions based
on the previous points. These assumptions may not necessarily hold in
the sender or the receiver system’s IT landscape.
• The IT landscape includes an EAI platform. This assumption does not mean
that every interface goes via the EAI platform; the document assumes that
there are clear definitions of when an EAI platform is to be used. The criteria for
selecting the right interface do not depend on whether an EAI platform is
used or not.
• The sender and the receiver system belong to one security zone.
• The sender and/or the receiver system can communicate by other types of
interfaces than just by the File interface.
• The sender and/or the receiver system can implement some or all the
general SAP interfaces.
• The purpose of this document is to help and guide projects with application
integration components to formulate design decisions and determine the
right interface when communicating with SAP systems. The main input to
determine the right interface is the business requirements. These
requirements need to be translated to integration requirements.
The current document answers the question:
• How to determine a suitable interface to use when integrating to SAP?
The following topics are not included in the guideline document:
o When to use an Enterprise Application Integration (EAI) tool and
when it is allowed to not use it. This issue is a matter for a new work
(guidelines) in the future.
o Different versions of SAP and their impacts on the SAP-provided
interfaces.
o The roadmap of SAP interfaces/integration platforms.
o The impacts of using custom IDocs, custom BAPIs and custom RFCs
when the project cannot find a standard IDoc, BAPI or RFC.
o Setup and configuration of RFC, IDocs and other connections.
o SAP Enterprise Service Oriented Architecture (or Enterprise SOA)
This document is intended for project managers, developers, business architects,
solution architects, software experts, software architects, and anyone responsible or
involved in taking design decisions when there are needs for integrating SAP systems
with other SAP or non-SAP systems.
3. The Method
The method used in this document is based on the PDCA (Plan, Do, Check, Act)
method. We use this method to gather and agree on requirements, plan and
design the solution, implement, test, and act if there are any gaps between the
requirements and the outcomes of the integration solution. The PDCA cycle is widely
used in industry and in many lines of business to improve processes and as a tool for
problem solving.
1. I added the stakeholders and the stakeholders’ needs as a new component
in the center of the PDCA method. Stakeholders are the starting point and
the end point of every assignment. The main tasks of a project manager are to:
• Ensure the satisfaction of the stakeholders.
• Translate the stakeholders’ needs into understandable
requirements in order to ensure a correct delivery.
2. Plan: start with scoping the goal, objectives and requirements of the
solution.
3. Do: execute according to the plan.
4. Check: verify whether there are gaps between the required outcomes and
the actual outcomes.
5. Act: take further actions to correct the result if needed.
Figure 1: PDCA with stakeholders management
3.1 Stakeholders and Stakeholders’ needs
If we go back to our method, we put our stakeholders and their needs in the middle
of the PDCA method and we decided to start with the stakeholders. The first step is
to identify these stakeholders. Some main principles for managing your
stakeholders are:
• If you have many stakeholders and there are risks of conflicts between
different interests and expectations, you may need to analyze your
stakeholders’ power and influence in your assignment.
• Clarify and meet their expectations
• In the end you may not be able to satisfy all your stakeholders; concentrate
on the key stakeholders with the most power and influence.
Identifying stakeholders is not an easy task, and you as a project manager have to
proactively seek out new stakeholders. Face-to-face meetings with stakeholders are a
good way to gather their concerns, and actively engaging your stakeholders is a
good way to obtain commitment. You need to build trust with your stakeholders.
The question here is: who are the main stakeholders for the integration
requirements?
At the beginning of a project, you as a project manager may not have identified all the
stakeholders of the project. As project manager you need to find an integration
architect in order to analyze your project from an integration point of view. For an
integration architect it is important to understand the boundaries of the project and
the processes, the information and, as we mentioned previously, the IT landscape that
will be impacted by the project.
When the impacted processes and information have been identified, the architects can
start mapping them and creating the process/system and
information/system matrices and views. These deliverables are very important in
order to:
• Define the information to be exchanged and identify the information flows.
• Define the security requirements and other quality-attribute requirements
(e.g. guaranteed delivery, …).
The stakeholders for the integration requirements will be the process and the
information owners, security officers, IT (infrastructure, developers,…), the project
manager and sponsor.
As an integration architect or integration project manager (if one is needed for a sub-
project), you may involve the users in order to check their view of the
requirements and the desired outcomes of the integration solution. The list of
stakeholders could be much longer or shorter depending on the size and the impact
of the project.
As an integration architect you need to gather and analyze the relevant
requirements from the involved stakeholders. In many cases you may need to
perform a pre-study to check the feasibility, costs, impacts and risks of
the integration solution.
Nothing is new here: as requirements manager you need to identify, specify, analyze
and agree on the use cases, the quality attributes and other requirements for
your project. And as integration architect you need to translate (and maybe
specify and detail further) these requirements into integration
requirements.
3.1.1 The requirements of a project
The requirements on SAP integration or on a specific information flow that involves
SAP, determine what SAP interface is the most suitable to fulfil the requirements.
As an integration/solution architect or a developer responsible for providing
an accurate solution to the agreed requirements, you may need to have a holistic
view of the requirements, especially if the project is very complex and involves
many applications or systems (e.g. implementing a new SAP ERP system and
connecting the new system to internal and external systems). In this case, if you
are responsible for providing the integration solution, you will need to
understand the processes and the information needs behind these processes. Such
projects (e.g. integrating a new billing and CRM system with other internal and
external systems) require effort from many business/IT architects, project
managers, developers and others.
As we mentioned in the previous section, if the list of use cases and quality
attributes is ready (gathered, specified, analyzed and agreed) and the integration
architect was involved in this process, then the translation into integration
requirements will be much easier than starting from nothing.
The following is a list of quality attributes that have an impact on the
integration requirements and the integration solution:
Reliability According to , reliability is the ability of a system to remain
operational over time. Reliability is measured as the probability
that a system will not fail to perform its intended functions over a
specified time interval.
Performance According to , performance is an indication of the
responsiveness of a system to execute any action within a given
time interval. It can be measured in terms of latency or
throughput. Latency is the time taken to respond to any event.
Throughput is the number of events that take place within a given
amount of time.
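The distinction between latency and throughput can be illustrated with a toy measurement harness (a generic sketch, not tied to any SAP transport; the handler is a stand-in for real message processing):

```python
import time

def measure(handler, events):
    """Return (average latency in seconds, throughput in events/second)."""
    start = time.perf_counter()
    latencies = []
    for event in events:
        t0 = time.perf_counter()
        handler(event)                       # time taken to respond to one event
        latencies.append(time.perf_counter() - t0)
    elapsed = time.perf_counter() - start    # events per unit of time
    return sum(latencies) / len(latencies), len(events) / elapsed
```

A solution can have low latency per message yet poor throughput overall (or vice versa), which is why both figures belong in the integration requirements.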
Using an integration platform like BizTalk Server and developing a
messaging solution that uses the BizTalk message box will reduce
the responsiveness of the system.
Security Security is the capability of a system to prevent malicious or
accidental actions outside of the designed usage, and to prevent
disclosure or loss of information. A secure system aims to protect
assets and prevent unauthorized modification of information, for
more information, see .
In many cases the EAIP (Enterprise Application Integration
Platform) is used as the interface to external systems. The main
reason is that an EAIP can handle the connectivity and the
receiving/sending of messages in different formats over different
protocols. Often these systems are designed with security in mind.
Interoperability According to , interoperability is the ability of a system or
different systems to operate successfully by communicating and
exchanging information with other external systems written and
run by external parties. An interoperable system makes it easier
to exchange and reuse information internally as well as externally.
In these guidelines and the cases presented in this
document, we assume that the projects have interoperability
requirements.
Maintainability According to , maintainability is the ability of the system to
undergo changes with a degree of ease. These changes could
impact components, services, features, and interfaces when
adding or changing the functionality, fixing errors, and meeting
new business requirements.
It is important to analyze whether the sender and the receiver systems
have integration capabilities out of the box. Otherwise, developing
integration capabilities in the sender and the receiver systems
could make these systems very complex and hard to maintain.
Even complex integration flows developed in an EAIP are easy to
maintain and possibly reuse.
Reusability According to , reusability defines the capability for
components and subsystems to be suitable for use in other
applications and in other scenarios. Reusability minimizes the
duplication of components and also the implementation time.
Creating Canonical Data Models helps organisations to reuse
these models, reduce the amount of mappings and to have a
common understanding of the data.
Traceability Message traceability is an out-of-the-box capability of an EAIP; it
enables tracing of metadata for messages sent through the
integration platform. The ability to archive full messages or to search
for messages by content are also capabilities that some of these
platforms provide.
Routing The Message Router pattern determines the recipient of the
message based on a set of conditions.
The Content-Based Router inspects the content of a message and
routes it to another channel based on the content of the message.
Using such a router enables the message producer to send
messages to a single channel and leave it to the Content-Based
Router to inspect messages and route them to the proper
destination. This relieves the sending application of this task
and avoids coupling the message producer to specific destination
channels.
A Basic Message Router uses fixed rules to determine the
destination of an incoming message. Where we need more
flexibility, a Dynamic Router can be very useful. This router allows
the routing logic to be modified by sending control messages to a
designated control port. The dynamic nature of the Dynamic
Router can be combined with most forms of the Message Router,
for more information, see .
This integration capability can be difficult to develop in a sender
system (we assume that the sender system is not an EAIP). If the
requirements indicate that there is a need to route
messages to different systems, it is wise to use an EAIP.
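A Content-Based Router can be sketched in a few lines (the message types and channel names below are invented for illustration):

```python
def content_based_router(message: dict) -> str:
    """Inspect the message content and return the destination channel.
    Types and channel names are hypothetical examples."""
    if message.get("type") == "invoice":
        return "accounting_channel"
    if message.get("type") == "meter_reading":
        return "billing_channel"
    # unroutable messages go to a dead-letter channel for inspection
    return "dead_letter_channel"
```

The producer only ever publishes to one channel; the routing rules live in the router, so adding a new destination does not change the sending application.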
Orchestration or workflow capability enables multiple information
flows to work as one where the process is coordinated within the
integration platform. In common cases this is used for workflow
logic i.e. which applications to call in which order and with what
messages, but in more specific cases it can also be extended with
business logic i.e. when the required business logic is not present
in any one of the integrated applications. Like the previous
capability, this is an EAIP capability.
Transformation In many cases, enterprise integration solutions receive, send and
route messages between existing applications such as legacy
systems, packaged applications, homegrown custom applications,
or applications operated by external partners. Each of these
applications is usually built around a proprietary data model. Each
application may have a slightly different notion of the Customer
entity, the attributes that define a Customer and which other
entities a Customer is related to. For example, the accounting
system may be more interested in the customer's tax payer ID
numbers while the customer-relationship management (CRM)
system stores phone numbers and addresses. The application’s
underlying data model usually drives the design of the physical
database schema, an interface file format or a programming
interface (API) -- those entities that an integration solution has to
interface with. As a result, the applications expect to receive
messages that mimic the application's internal data format, for
more information, see .
Transformation is one of the main capabilities of an EAIP.
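A minimal Message Translator toward a canonical data model might look like this (all field names are assumptions for the example, not real CRM or accounting schemas):

```python
def crm_to_canonical(crm_customer: dict) -> dict:
    """Translate a CRM-style customer record into a canonical customer model.
    Field names here are hypothetical."""
    return {
        "customer_id": crm_customer["id"],
        "phone": crm_customer.get("phone_number"),
        "address": crm_customer.get("postal_address"),
    }

def canonical_to_accounting(customer: dict) -> dict:
    """Project the canonical model onto the accounting system's format,
    which cares about the tax payer ID rather than contact details."""
    return {
        "cust_no": customer["customer_id"],
        "tax_id": customer.get("tax_payer_id"),
    }
```

With a canonical model in the middle, each application needs only one mapping to and from the canonical format instead of one mapping per peer.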
Table 1: Quality attributes that impact the integration solution
Interoperability is one of the main required capabilities in every enterprise. Projects
with IT components (e.g. purchasing a Supplier Relationship Management system, a
Work Order Management system etc.) need to define and analyze the impacts of the
current and future IT landscape on the purchased system and the impacts of the
purchased system on the current and future IT landscape. This is important to align
the purchased systems with the target architecture of the enterprise. When a project
with interoperability requirements starts in some part of your organization,
there will be a need to gather and analyze the integration requirements. Regardless
of the setup of your organization, as project manager you will need to appoint an
integration architect/expert to specify the integration requirements for the project’s
integration solution. As we mentioned at the beginning of this document, the
integration team or an integration architect should be involved at the beginning of a
project.
Below is a basic questionnaire for any project that is about to integrate two or
more applications/systems. The questionnaire defines the general requirements on
the information flow. The integration team asks (sends) these questions when it
receives a request for integration from a project.
The following table lists the relevant questions that should be asked by the project to
our stakeholders. The related requirements column will be very useful for the
project to design and determine the right interface (see the matrix on page 19). You
may need to add new questions and add new column “Stakeholder” and direct every
question to a specific stakeholder or a group of stakeholders.
• Is it necessary to have ordered delivery of messages? (Related requirement:
Ordered delivery)
• How critical is the information and what are the consequences of
losing or delaying messages (cost, quality, reputation etc.)?
• Do the process and the information flow involve many systems? (Related
requirements: Interoperability, …)
• Are there any requirements for rolling back transactions when a
part of the transaction scope fails?
• Does the information flow have to follow a request/response pattern
or any other specialized communication pattern (e.g.
synchronous/asynchronous, publish/subscribe, etc.)?
• What latency (delay) can be accepted for the delivery of messages?
• Are there any special security requirements (on authentication, etc.)?
This involves mechanisms like encryption, acknowledgements, etc.,
which all have an impact on the cost and the complexity of the solution.
• How much data will be transferred per time period (day, week and
month), in kilobytes and in business-related
numbers/transactions (# of invoices, orders, meter readings, …)?
(Related requirement: Large amount of data)
Table 2: Relevant questions that should be asked to the integration solution
The answers to the above table’s questions define which requirements on the
information flow are critical and should be considered by the project.
The following requirements are the most important when determining
the right interface:
o Transactional communication
o Guaranteed delivery
o Performance (low latency)
o In-order delivery
3.2 Definitions, Acronyms and Abbreviations
We start with defining the main terms that we will use in the rest of this document.
If you are familiar with these SAP terms and the other terms that are listed and
defined in the next pages, you may jump to the next section of this document.
3.2.1 SAP related Definitions
Remote Function Call (RFC) is the technology that allows calling a function module in an
SAP system from another SAP instance or from an external program. This method is
used for event-driven architecture and for synchronous communication. The calling
system is the RFC client; the called system is the RFC server. RFC is based on the RPC
(remote procedure call) model of the UNIX TCP/IP environment. RFCs manage the
communication process, parameter transfer, and error handling. Before an RFC
module can be called from an SAP system, the import/export parameters must be
known, and a technical connection must exist. This is known as an RFC connection
or RFC destination. For more information, see .
The following figure illustrates the types of RFCs that are explained in the next pages:
Figure 2: The RFC types
The first version of RFC is synchronous RFC (sRFC). This type of RFC executes the
function call based on synchronous communication, meaning that the systems
involved must both be available at the time the call is made.
Transactional RFC (tRFC)
Transactional RFC (tRFC, previously known as asynchronous RFC) is an asynchronous
communication method that executes the called function module just once in the
RFC server. The remote system need not be available at the time when the RFC client
program is executing a tRFC. The tRFC component stores the called RFC function,
together with the corresponding data, in the SAP database under a unique
transaction ID (TID).
If a call is sent, and the receiving system is down, the call remains in the local queue.
The calling dialog program can proceed without waiting to see whether the remote
call was successful. If the receiving system does not become active within a certain
amount of time, the call is scheduled to run in batch.
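The TID mechanism can be illustrated with a small sketch in which the receiver executes each transaction ID at most once; this is a simplification of the real tRFC protocol, and the class and names are invented for illustration:

```python
import uuid

class TrfcReceiver:
    """Executes each transaction ID (TID) at most once, mimicking tRFC semantics:
    a retried delivery with the same TID is recognized and not executed again."""
    def __init__(self):
        self.processed = set()

    def execute(self, tid, func, *args):
        if tid in self.processed:
            return "already executed"   # duplicate delivery is ignored
        func(*args)
        self.processed.add(tid)
        return "executed"

receiver = TrfcReceiver()
tid = str(uuid.uuid4())       # unique transaction ID for one LUW
results = []
receiver.execute(tid, results.append, "order-1")   # first delivery runs
receiver.execute(tid, results.append, "order-1")   # retry is deduplicated
```

This is why the calling program can safely retry after a connection failure: the TID guarantees the function module runs only once.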
tRFC is always used if a function is executed as a Logical Unit of Work (LUW). Within
a LUW, all calls:
o Are executed in the order in which they are called
o Are executed in the same program context in the target system
o Run as a single transaction: they are either committed or rolled back as a
unit.
Implementation of tRFC is recommended if you want to maintain the transactional
sequence of the calls.
Disadvantages of tRFC:
o tRFC processes all LUWs independently of one another. Due to the
number of activated tRFC processes, this procedure can reduce
performance significantly in both the sending and the target systems.
o In addition, the sequence of LUWs defined in the application cannot
be kept. It is therefore impossible to guarantee that the transactions
will be executed in the sequence dictated by the application. The only
thing that can be guaranteed is that all LUWs are transferred sooner
or later.
Queued RFC (qRFC)
To guarantee that multiple LUWs are processed in the order specified by the
application, tRFC can be serialized using queues (inbound and outbound queues).
This type of RFC is called queued RFC (qRFC).
qRFC is therefore an extension of tRFC. It transfers an LUW (transaction) only if it has
no predecessors (based on the sequence defined in different application programs)
in the participating queues.
Implementation of qRFC is recommended if you want to guarantee that several
transactions are processed in a predefined order.
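The ordering guarantee of qRFC can be sketched as a strict FIFO queue of LUWs (a simplification; the class name and structure are invented for illustration):

```python
from collections import deque

class QrfcQueue:
    """Processes LUWs strictly in the order they were enqueued, so a LUW is
    transferred only after all of its predecessors in the queue."""
    def __init__(self):
        self.queue = deque()
        self.log = []   # records the order in which LUWs were executed

    def enqueue(self, luw):
        self.queue.append(luw)

    def process_all(self):
        while self.queue:
            # strict FIFO: a LUW runs only when it has no predecessors left
            self.log.append(self.queue.popleft())
```

Contrast this with plain tRFC, where each LUW is an independent retry unit and the execution order across LUWs is not guaranteed.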
For more information, see SAP Help portal:
BAPI function modules are special function modules that are released by SAP as an
official application programming interface and are supported for public use.
Hence, BAPIs are a subset of the RFC-enabled function modules. BAPIs are
implemented and stored as RFC-enabled function modules in the ABAP Workbench
and can also be used as a basis for WebServices. This method is used for
synchronous communication.
BAPIs play an important role in technical integration and business data exchange
between SAP components and external components. For more information, see .
o BAPIs can tell if the sending was successful or not
o Sending to/processing on the other side is immediate
o Easier to create custom BAPI than custom IDoc
o Will only work if there is an active online connection.
o Some programming required to call a BAPI.
We need to explain the differences between BAPIs and RFCs. This is important
because of the differing availability of RFCs and BAPIs for access by non-SAP systems.
BAPIs, and especially standard (non-custom) BAPIs, are by default available to be
accessed (called) by non-SAP systems. RFCs by default are only available to be
accessed by an SAP system.
Technically, there is no difference between RFCs and BAPIs, but SAP wants you to
use a BAPI to access R/3 functionality from external programs whenever a BAPI
is available.
It is not possible to connect SAP to non-SAP systems to retrieve data using RFC
alone; from the outside, SAP functionality should be accessed through BAPIs, for
more information see .
Intermediate Documents are a transport vehicle for data transfer in and out of an
SAP system. The data transferred between the systems is in the IDoc format, which is
a text format. This method is used for:
• Event-driven architecture and asynchronous communication.
• Sharing information by messaging.
Different message types (such as delivery notes and purchase orders) normally
represent the different specific formats, known as IDoc types. Business data is
exchanged between systems by means of electronic data interchange (EDI) and
application link enabling (ALE). The IDoc interface, in which a data structure and
corresponding processing logic is defined, is required for both forms of data transfer.
The structure of the IDoc is described by the IDoc type. This contains, for example,
information about where data is stored. For more information, see .
Benefits of IDocs are: they are event-driven, they are readable in the case of a
malfunction, they ensure transactionality and traceability, and they offer a future-
proof solution, especially if the destination system wants to receive the information.
It should be taken into consideration that to make system and IDoc processing
optimal, IDocs should be processed in the background, otherwise the system can be
blocked by too many IDocs all using dialog processes.
o System can work even if target system not always online. The IDoc will
be created and sending will just continue once you get connected to
the other system.
o No additional programming required; you just need to set up the
configuration.
o Receipt/processing on the target system may not be immediate.
o The sending system has no way to know whether the target system
actually received what you sent (unless you use ALE).
o If there is no SAP standard IDoc available for your requirement, it's
harder to create a customized IDoc than a customized BAPI.
o The output of an IDoc structure schema (the flat file) will be filled
with zeros for empty positions in a field and for the rest of a partly filled
field, and therefore IDocs can be very bandwidth-intensive.
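Because the flat file uses fixed-width, padded fields, a receiving system typically slices each record by position. A minimal sketch (the segment layout below is invented for illustration, not a real IDoc type):

```python
def parse_segment(line: str, layout: list) -> dict:
    """Slice a fixed-width IDoc-style record into named fields.
    `layout` is a list of (field_name, width) pairs; positions are cumulative."""
    record, pos = {}, 0
    for name, width in layout:
        record[name] = line[pos:pos + width]
        pos += width
    return record

# Hypothetical segment layout: 10-char document number, 8-char date, 12-char amount
layout = [("docnum", 10), ("date", 8), ("amount", 12)]
```

The padding shown in the disadvantage above is visible here: even a short value occupies its field's full fixed width on the wire.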
ALE technology: Application Link Enabling (ALE) is a technology for creating and
running distributed applications.
ALE objectives: ALE incorporates a controlled exchange of data messages, ensuring
data consistency across loosely coupled applications. The basic principle of ALE is to
provide a distributed yet fully integrated R/3 system.
Difference between ALE and EDI: Normally we refer to EDI technology when a non-
SAP system is one of the communication partners. ALE communication occurs on
the SAP side and EDI on the non-SAP side. IDocs use ALE and EDI to deliver the
data to the receiving system. If the data needs to be exchanged between two SAP
systems, the IDoc uses ALE technology. For the exchange of data between an SAP
and a non-SAP system, the IDoc uses an EDI subsystem to convert and deliver the
data, see .
ALE is a slightly older but still valid means of creating and operating distributed
applications involving the monitored exchange of consistent data between systems.
ALE consists of application services, distribution services, and communication
services. Both master data and application data can be exchanged using ALE. For
more information, see .
o ALE offers better inbound interface performance compared to
traditional techniques such as Batch Data Communication (BDC) or
Call Transactions. ALE does not use screen-based batch input.
o ALE provides loose coupling with legacy and third-party
applications and is a key element of the Business Framework. It provides a
message-based architecture for asynchronous integration of Business
Framework components, including Business Components, Business
Objects and BAPIs; for more information, see .
o The disadvantage of asynchronous communication via ALE is that
only a single return parameter from the called system is available, see .
3.2.2 Other Definitions
Many of the following definitions are based on the material provided in the following
references.
In our integration framework we used the patterns provided by Gregor Hohpe and
others in many of our integration solutions; see the reference .
3.2.2.1 Adapters and Interfaces
Adapters (Channel Adapters):
Many enterprises use Messaging to integrate multiple, disparate applications.
How can you connect an application to the messaging system so that it can send and receive messages?
Figure 3: Adapters (Source )
Use a Channel Adapter that can access the application's API or data and publish
messages on a channel based on this data, and that likewise can receive messages
and invoke functionality inside the application.
The adapter acts as a messaging client to the messaging system and invokes application functions via an application-supplied interface. This way, any application
can connect to the messaging system and be integrated with other applications as
long as it has a proper Channel Adapter.
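The Channel Adapter pattern above can be sketched in a few lines of Python. This is a minimal in-memory illustration, not any real messaging API; the class and method names (MessageChannel, ChannelAdapter, etc.) are our own invention:

```python
import json

class MessageChannel:
    """A minimal in-memory channel standing in for the messaging system."""
    def __init__(self):
        self._queue = []

    def publish(self, message):
        self._queue.append(message)

    def consume(self):
        return self._queue.pop(0) if self._queue else None

class LegacyApp:
    """Stands in for an application that only exposes a plain API, not messaging."""
    def __init__(self):
        self.orders = {}

    def create_order(self, order_id, amount):
        self.orders[order_id] = amount

class ChannelAdapter:
    """Connects the application's API to the channel in both directions."""
    def __init__(self, app, channel):
        self.app = app
        self.channel = channel

    def publish_order(self, order_id, amount):
        # Read data via the application's API and publish it as a message.
        self.channel.publish(json.dumps({"order_id": order_id, "amount": amount}))

    def deliver(self):
        # Receive a message and invoke functionality inside the application.
        raw = self.channel.consume()
        if raw is not None:
            data = json.loads(raw)
            self.app.create_order(data["order_id"], data["amount"])

channel = MessageChannel()
receiver = LegacyApp()
adapter = ChannelAdapter(receiver, channel)
adapter.publish_order("PO-1", 250.0)
adapter.deliver()
```

The point of the sketch is the direction of dependency: the application knows nothing about messaging; only the adapter does.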
Interface or Messaging Bridge:
An enterprise is using Messaging to enable applications to communicate. However,
the enterprise uses more than one messaging system, which confuses the issue of
which messaging system an application should connect to.
How can multiple messaging systems be connected so that messages available on
one are also available on the others?
Figure 4: Interface/Message Bridge (Source )
Use a Messaging Bridge, a connection between messaging systems, to replicate
messages between systems.
Typically, there is no practical way to connect two complete messaging systems, so
instead we connect individual, corresponding channels between the messaging
systems. The Messaging Bridge is a set of Channel Adapters, where the non-
messaging client is actually another messaging system, and where each pair of
adapters connects a pair of corresponding channels. The bridge acts as a map from
one set of channels to the other, and also transforms the message format of one
system to the other. The connected channels may be used to transmit messages
between traditional clients of the messaging system, or strictly for messages
intended for other messaging systems.
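A minimal sketch of the Messaging Bridge idea, assuming two in-memory channels whose systems use different message formats (all names here are illustrative, not a real product API):

```python
class Channel:
    """A trivial in-memory stand-in for one channel of a messaging system."""
    def __init__(self):
        self.messages = []

    def publish(self, message):
        self.messages.append(message)

    def consume(self):
        return self.messages.pop(0) if self.messages else None

def bridge(source, target, transform):
    """Replicate every pending message from source to target, transforming
    the message format on the way (one adapter pair per channel pair)."""
    moved = 0
    while True:
        msg = source.consume()
        if msg is None:
            return moved
        target.publish(transform(msg))
        moved += 1

# System A uses dicts; system B expects flat "key=value" strings.
a_channel, b_channel = Channel(), Channel()
a_channel.publish({"type": "order", "id": 1})
a_channel.publish({"type": "order", "id": 2})
moved = bridge(a_channel, b_channel,
               lambda m: ";".join(f"{k}={v}" for k, v in m.items()))
```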
3.2.2.2 Guaranteed Delivery
Distributed systems over many networks do not, by default, guarantee the delivery of messages between the involved systems. Guaranteed Delivery is a mechanism that involves more or less complex behavior and measures (e.g. storing, acknowledgements, using reliable protocols, etc.) on the sender and the receiver sides.
According to , assuming that the enterprise is using Messaging to integrate applications, the question is: how can the sender make sure that a message will be delivered, even if the messaging system fails?
With Guaranteed Delivery, the messaging system uses a built-in data store, which
enables messages to persist. Each computer the messaging system is installed on,
has its own data store so that the messages can be stored locally. When the sender
sends a message, the send operation does not complete successfully until the
message is safely stored in the sender’s data store. Subsequently, the message is not
deleted from one data store until it is successfully forwarded to and stored in the
next data store. In this way, when the sender successfully sends the message, it is
always stored on disk on at least one computer, see the figure below, until it is
successfully delivered and acknowledged by the receiver. For more information, see
Figure 5: Guaranteed Delivery (Source )
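The store-and-forward behaviour described above can be illustrated with a small Python sketch; the in-memory lists stand in for the persistent data stores of a real messaging system, and all names are our own:

```python
class Endpoint:
    """Each endpoint has its own local data store so messages persist."""
    def __init__(self):
        self.store = []       # stands in for an on-disk message store
        self.delivered = []

def send(sender, message):
    # The send operation completes only once the message is in the
    # sender's own data store.
    sender.store.append(message)

def forward(sender, receiver):
    """Move messages store-to-store; delete from the sender's store only
    after the message is stored and acknowledged at the next hop."""
    while sender.store:
        message = sender.store[0]
        receiver.store.append(message)   # stored at the next hop first
        ack = True                       # receiver acknowledges storage
        if ack:
            sender.store.pop(0)          # now safe to delete locally
            receiver.delivered.append(receiver.store.pop(0))

s, r = Endpoint(), Endpoint()
send(s, "invoice-42")
forward(s, r)
```

At every point in time the message exists in at least one store, which is exactly the invariant Guaranteed Delivery relies on.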
3.2.2.3 Transactional messages
A messaging system, by necessity, uses transactional behavior internally. It may be
useful for an external client to be able to control the scope of the transactions that
impact its behavior. The question is: How can a client control its transactions with
the messaging system?
Both a sender and a receiver can be transactional. With a sender, the message isn’t
“really” added to the channel until the sender commits the transaction. With a
receiver, the message isn’t “really” removed from the channel until the receiver
commits the transaction, see the figure below. A sender that uses explicit
transactions can be used with a receiver that uses implicit transactions, and vice
versa. A single channel might have a combination of implicitly and explicitly
transactional senders; it could also have a combination of receivers. For more
information, see .
Figure 6: Transactional Messaging (Source )
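A rough in-memory sketch of a transactional receiver in Python; the commit/rollback semantics mirror the description above, but the class is purely illustrative and not a real messaging API:

```python
class TransactionalChannel:
    """Messages handed to a receiver are not 'really' removed
    until the receiver commits the transaction."""
    def __init__(self):
        self.messages = []
        self._pending = {}   # messages consumed but not yet committed
        self._next_tx = 0

    def receive(self):
        """Hand out a message inside a transaction; it stays recoverable."""
        if not self.messages:
            return None, None
        tx = self._next_tx
        self._next_tx += 1
        self._pending[tx] = self.messages.pop(0)
        return tx, self._pending[tx]

    def commit(self, tx):
        # Only now is the message 'really' removed from the channel.
        del self._pending[tx]

    def rollback(self, tx):
        # Processing failed: put the message back at the front of the channel.
        self.messages.insert(0, self._pending.pop(tx))

ch = TransactionalChannel()
ch.messages.append("update-material-100")
tx, msg = ch.receive()
ch.rollback(tx)           # simulated processing failure: message is restored
tx2, msg2 = ch.receive()
ch.commit(tx2)            # successful processing: message is gone for good
```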
When a sender and/or a receiver use files to integrate their application, they cannot
guarantee the delivery, because files are not transactional.
Transactional behaviour involves complex mechanisms like commit and rollback.
To explain more about transactional behaviour, we take another example from the SOA domain.
According to , as per the next figure, Services A and B complete their respective
tasks successfully. However each time they do, they initiate a local transaction,
temporarily saving the current state of the database prior to making their changes
(1, 2). After Service C fails its database update attempt (3), Services A and B restore
their databases back to their original states (4, 5). The business task is effectively
reset or rolled back across services within the pre-defined transaction boundary.
Figure 7: Transactional Messaging (SOA) (Source )
3.2.2.4 Event driven messaging
An application needs to consume messages as soon as they are delivered. Here the question is: how can an application automatically consume messages as they become available?
The application should use an Event-Driven Consumer, one that is automatically
handed messages as they’re delivered on the channel, see the figure below.
This is also known as an asynchronous receiver, because the receiver does not have a
running thread until a callback thread delivers a message. For more information, see
Figure 7: Event Driven Messaging (Source )
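The Event-Driven Consumer can be sketched as a simple callback registration; in a real messaging system the callback would run on a separate (callback) thread, which this single-threaded illustration omits:

```python
class EventDrivenChannel:
    """Hands messages to registered callbacks as soon as they arrive,
    instead of making the consumer poll for them."""
    def __init__(self):
        self._consumers = []

    def register(self, callback):
        self._consumers.append(callback)

    def deliver(self, message):
        # In a real messaging system this would run on a callback thread.
        for callback in self._consumers:
            callback(message)

received = []
channel = EventDrivenChannel()
channel.register(received.append)   # the consumer is invoked automatically
channel.deliver("new-work-order")
```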
3.2.2.5 Synchronous and Asynchronous Communication
Communication between two systems can basically be split into two types: synchronous and asynchronous communication. Both forms of communication have
specific advantages and disadvantages, relating to either the business application or
the system administration.
3.2.2.5.1 Synchronous communication
According to , synchronous communication uses a single function call. A
prerequisite for this is that at the time the call is made (or the message is sent), the
receiving system is also active and can accept the call and further process it if
necessary, see the next figure.
Advantage: Synchronous communication can be implemented in function calls that
require the immediate return of data to the sender system.
Example: You create a purchase order with account assignment in the sender
system, and you want to perform a budget check in central accounting before you
save the purchase order. Another example is when the customer call center checks
the credit of a new customer.
Disadvantage: You need to ensure that both systems are active and can be
contacted. If they are not, this can lead to a serious disruption of processes. In
particular, problems can arise if the receiving system is not available for long periods
of time due to maintenance (for example, for a system upgrade).
Figure 8: Synchronous communication (Source )
3.2.2.5.2 Asynchronous Communication
According to :
For asynchronous communication, the receiving system does not necessarily have to
be available at the time a function call is dispatched from the sender system. The
receiving system can receive and process the call at a later time. If the receiving
system is not available, the function call remains in the outbound queue of the
sending system, from where the call is repeated at regular intervals until it can be
processed by the receiving system, see the next figure.
Advantages: The receiving system does not have to be available at the time the
function call is made. If the system is unavailable for a long period of time, for
example, for an upgrade, it can still process the data that has been sent in the
interim at a later time, and processes in the sending system remain unharmed.
Example: You are sending a purchase order to a vendor system. The sending system cannot influence the availability of the receiving system. If the receiving system is not available, the purchase order can be sent repeatedly until the vendor system is available again.
In asynchronous communication, you usually have the option to send data (for
example, business documents or changes to master data) in packages or individually
(immediately). Note that the Send Immediately option in asynchronous communication should not be confused with synchronous communication.
The advantage of sending data in packages is that system resources are employed more efficiently, because each function call occupies one work process in the receiving system.
Example: You want to distribute 100 material master changes to other systems. If
you send the changes in a package (with 100 pieces) you only require one work
process. If you sent the same 100 material master changes individually, you would
need 100 work processes in the system.
When using asynchronous communication, you should therefore always carefully consider the availability of your system resources and the necessity of immediate data transfer.
Disadvantage: Processes that require an immediate response to the sender system
cannot be executed using this method.
Figure 9: Asynchronous communication (Source )
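The outbound-queue-with-retry behaviour described for asynchronous communication can be sketched as follows (an illustrative model only, not SAP's actual queued-RFC implementation):

```python
class ReceivingSystem:
    """Stands in for a receiver that may be down, e.g. during an upgrade."""
    def __init__(self):
        self.available = False
        self.processed = []

    def accept(self, call):
        if not self.available:
            raise ConnectionError("receiver down")
        self.processed.append(call)

def flush_outbound_queue(queue, receiver):
    """Retry every queued call; calls that still fail stay in the queue
    and are repeated at the next interval."""
    remaining = []
    for call in queue:
        try:
            receiver.accept(call)
        except ConnectionError:
            remaining.append(call)
    return remaining

queue = ["purchase-order-1"]
receiver = ReceivingSystem()
queue = flush_outbound_queue(queue, receiver)  # receiver down: call stays queued
receiver.available = True                      # e.g. the upgrade has finished
queue = flush_outbound_queue(queue, receiver)  # now the call goes through
```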
3.2.2.6 WebServices (with Reliable Messaging)
According to :
Whenever you are using Web services in business-critical applications, it is important
that messages are exchanged in a reliable manner. Web Services Reliable Messaging
(WSRM) ensures that message exchange is performed correctly – without messages
getting lost or being duplicated. WSRM ensures a reliable exchange of messages
even when the connection to the network is lost – for example, during a purchase transaction.
To guarantee message exchange and also to control the sequence of incoming
messages, the WSRM protocol combines one or multiple messages into sequences.
Sequences contain a unique identifier. Messages in a sequence are numbered
consecutively. The WSRM sequence header in the SOAP message identifies the
sequence to which a message belongs.
WS Reliable Messaging implementations at the sender and receiver sides ensure that messages are transferred reliably. A prerequisite for this is that incoming messages
are confirmed by the receiver. For this purpose, the specification defines the format
of an acknowledgement that the receiver sends to the sender as confirmation. The
sender waits for the confirmation and, if necessary, keeps sending the message until
the confirmation is received.
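The sequence-numbering and acknowledgement idea behind WSRM can be sketched in a few lines; this is an illustrative model of duplicate detection, not an implementation of the WS-ReliableMessaging specification:

```python
import uuid

class ReliableReceiver:
    """Tracks which message numbers of a sequence have been seen, so that
    duplicates from resends are dropped and an acknowledgement is returned."""
    def __init__(self):
        self.seen = {}       # sequence id -> set of message numbers
        self.accepted = []

    def receive(self, sequence_id, number, body):
        numbers = self.seen.setdefault(sequence_id, set())
        if number not in numbers:        # duplicate resends are ignored
            numbers.add(number)
            self.accepted.append(body)
        # The acknowledgement the sender waits for before giving up resending.
        return {"ack": (sequence_id, number)}

seq = str(uuid.uuid4())                  # sequences carry a unique identifier
rx = ReliableReceiver()
ack1 = rx.receive(seq, 1, "part-1")
rx.receive(seq, 1, "part-1")             # sender resent: no ack arrived in time
rx.receive(seq, 2, "part-2")
```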
3.2.2.7 WebServices (without Reliable Messaging)
You can either use the inside-out or outside-in approach to develop a point-to-point
communication using Web Services (for more information, see ). Both approaches
work with the WSDL standard (WSDL: Web Service Description Language) to transfer
all the necessary information for calling the service to the service consumer.
The development of point-to-point communication using classic Web Services is
performed using the inside-out approach: a WSDL document, in which all the
necessary information for calling the function is contained, is generated for a pre-
existing function in an application system. Using this WSDL document, consumers
can generate a runtime substitute (a proxy) on their platform and in this way the
consumer application can call the provider’s service.
In outside-in development, you work with service interfaces (outbound and
inbound), which you create directly in the ES (Enterprise Services) Repository.
Service interfaces are language-independent and are based on the WSDL standard.
Using this description, you can generate language-specific proxies, which you can
then use to implement the actual message exchange. The generated proxies are part
of the application and forward calls to the Web Service runtime that executes the
point-to-point message exchange. For more information, see .
In computing, the SSH File Transfer Protocol (sometimes called Secure File Transfer
Protocol or SFTP) is a network protocol that provides file access, file transfer, and file
management functionality over any reliable data stream. It was designed by the
Internet Engineering Task Force (IETF) as an extension of the Secure Shell protocol
(SSH) version 2.0 to provide secure file transfer capability, but is also intended to be
usable with other protocols. The IETF Internet Draft states that even though this protocol is described in the context of the SSH-2 protocol, it is general and independent of the rest of the SSH-2 protocol suite. It could be used in a
number of different applications, such as secure file transfer over Transport Layer
Security (TLS) and transfer of management information in VPN applications.
This protocol assumes that it is run over a secure channel, such as SSH, that the
server has already authenticated the client, and that the identity of the client user is
available to the protocol . SFTP is a viable solution in order to comply with the security requirements.
For more information about File, see the next section.
Using files as the transport in integration is by far the most common way to convey
information between information systems. Files are great for storing data but as a
means for secure integration, they are not the best alternative.
Consider the following when using files as transport in integrations:
Files are easy to comprehend; everyone understands the concept of a file.
The tools to manipulate files (for viewing, deleting, appending, etc) are known by
everyone and built into the operating system.
Files are NOT transactional; that is, there is no standard way to ensure that ‘all or nothing’ is transferred. A common error in all file-based integration is that if a file transfer is disrupted, a fraction of the file is transferred and the rest is not. In order to assure transactionality when using files, additional mechanisms must be added on top, i.e. a proprietary protocol must be created. This makes it complicated, and thereby expensive and error prone, if good transactionality is required. A possible solution would be to add checksums to files, manually or by means of existing solutions, for example in archiving programs (gzip).
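Adding a checksum to detect a disrupted transfer is straightforward; for example (an illustrative sketch, using SHA-256 rather than the simpler checksums of archiving tools):

```python
import hashlib

def write_with_checksum(payload: bytes):
    """Sender side: return the payload plus a checksum the receiver can verify."""
    return payload, hashlib.sha256(payload).hexdigest()

def verify(payload: bytes, checksum: str) -> bool:
    """Receiver side: accept the file only if it arrived in its entirety."""
    return hashlib.sha256(payload).hexdigest() == checksum

data, digest = write_with_checksum(b"material master changes")
ok = verify(data, digest)            # complete transfer passes
truncated = verify(data[:5], digest) # disrupted transfer is detected
```

Note that this only detects an incomplete transfer; it does not by itself provide retry or rollback, which is exactly why file transport alone does not give transactionality.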
File names are the only way to separate one file from another. This makes file naming sensitive, and one must be extra careful to ensure unique file names. A common mistake is to use the hours, minutes and seconds of the time stamp as part of the file name. If more than one file is received in the same second, the file names will not be unique and there will be a collision error.
File names should ensure the unique identification of the file; therefore using the
time stamp should be avoided. Use sequence number or a randomized GUID
(globally unique identifier) if possible.
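For example, a GUID-based naming scheme (the prefix and extension here are, of course, arbitrary placeholders):

```python
import uuid

def unique_file_name(prefix="hr_export", extension="xml"):
    """Build a collision-free file name from a random GUID instead of a
    timestamp, so two files created in the same second cannot collide."""
    return f"{prefix}_{uuid.uuid4()}.{extension}"

# Even a large burst of files created at the same instant gets unique names.
names = {unique_file_name() for _ in range(1000)}
```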
As a result of the facts above, it is not uncommon to add functionality to the basic
file transport to ensure transactionality and the unique identity of messages being
conveyed. This is close to designing your own proprietary protocols, which will
certainly drive up development costs. If transactionality and unique identity are
important characteristics, file transport should be avoided.
SFTP could be a viable solution to comply with the security issue.
Now that we have defined the important components of an integration flow that involves one or more SAP systems, and we know more about the integration requirements, we need to do the following in order to design an accurate solution and plan the implementation of the integration solutions:
1. Define the connectivity possibilities for the involved applications
2. Define the integration requirements
3. Define the solution or the alternative solutions
4. Estimate the required time to implement the solution
5. Create a budget
6. Plan the implementation
4.1.1 Connectivity to applications
Another important parameter that should be taken into consideration is the
connectivity possibilities of the application that needs to be integrated with SAP as
well as the connectivity possibilities of the SAP application.
The following questions should be asked by the project to the system manager of the applications to be integrated with SAP and to the system manager of the SAP application:
• How can data be accessed in the application(s)?
• Are there any proprietary connectivity mechanisms, such as APIs, SAP IDocs, etc.?
• What is the geographical location of the application hardware?
• Are there any other considerations that must be handled regarding firewalls and networking, like applications being placed outside the organization’s internal network, behind their own firewall or in a DMZ, etc.? (Note: this must be known for all environments, not only for production instances of the applications.)
The above questionnaire defines the general characteristics of the application that
needs to be integrated with SAP.
4.1.1.1 How does SAP communicate/interface with other systems?
As we mentioned previously, in order to support business processes and data
sharing across applications, applications need to be integrated. Application
integration needs to provide efficient, reliable, and secure data exchange between
multiple enterprise applications .
Applications need to send and receive information from other applications to
complete their process. The information could be shared by:
• Sending/receiving files
• Sharing databases; this type of sharing of information is not included in the
scope of this document
• Remote procedure invocation
As we mentioned earlier, SAP uses BAPI/RFC, IDocs, ALE, WebServices with Reliable
Messaging, WebServices without Reliable Messaging and Files to receive and send
messages from/to different destinations.
The diagram in the following figure explains how the interfacing mechanisms in SAP work. The diagram lists the main adapters that are available in the SAP
environment. Some of these adapters are available via the SAP R/3 and mySAP ERP
and other adapters are available via the SAP NetWeaver Application Server and SAP
NetWeaver Process Integration. There are of course many other standard and third
party adapters that are not listed in the following figure. For example, from the
company SEEBURGER all adapters that are covered by the reseller agreement
between SAP and SEEBURGER are already certified for SAP NetWeaver PI 7.1; to be
more specific, the following SEEBURGER adapters for SAP NetWeaver PI 7.1 that are
covered by the reseller agreement are available (see ):
Technical EDI Adapters
OFTP (OFTP / ISDN, OFTP / TCPIP)
VAN Access (P7 / X.400, VAN FTP)
We will not define or use these adapters, because they are not included in the scope
of the guidelines.
Figure 10: SAP Interface architecture
4.1.2 The decision matrix
As we mentioned in the beginning of this document, there may be more than one alternative for connecting two or more systems that can be accessed in different ways. If we analyze the following figure, we see the requirements on the left side and the interfaces on the right side. Our assignment is to select the right interface for the agreed requirements.
Figure 11: Integration Requirements vs Interfaces
If we assume that we can use asynchronous communication for our messages, then
the alternative interfaces will be:
Figure 12: Many Alternative Interfaces for a Requirement
The following table describes the different SAP interfaces and the corresponding
requirements that they implement. It should be helpful when making the decision
about which interface is most suitable and meets the requirements:
Table 3: Decision Matrix
Sync. Com. = Synchronous communication; Async. Com. = Asynchronous communication.
Yes = shall be used; No = not allowed.
A = Applicable: there are possibilities to fulfill the requirement (depending on the integration scenario and on the provided/available functionality).
N/A = Not Applicable (the interface does not have the technical capabilities to fulfill the requirement).
Transactionality: in this scope we define Transactionality as ‘the ability to ensure that a message is conveyed in its entirety’. The whole message or nothing at all is sent.
Footnotes 4 and 6: if there are no requirements on Transactionality. Footnotes 5 and 7: by using a program.

Interface | Sync. Com. | Async. Com. | Guaranteed Delivery | Event driven | Reusable mechanism | Large amount of data | Transactionality | Security
IDoc (Inbound & Outbound) | N/A | Yes | Yes | Yes | Yes | Yes | Yes | Yes
ALE | N/A | Yes | Yes | Yes | Yes | Yes | Yes | Yes
RFC/BAPI | Yes | A | Yes | Yes | No | N/A | Yes | Yes
Services with RM | Yes | A | – | – | No | N/A | Yes | Yes2
Services without RM | Yes | A | – | – | No | N/A | No | Yes3
File (FTP) | N/A | Yes4 | – | – | No | No | Yes | Yes5
SFTP | N/A | Yes6 | – | – | No | No | Yes | Yes7
4.1.3 Determining which interface method to use when interfacing to SAP
Depending on the requirements on the SAP interfaces (e.g. Guaranteed Delivery, Transactional messages, the reliability and performance of the interface), there are several ways to communicate with SAP (see the table) and there are at least
two cases to handle:
Case 1: If the requirements include Guaranteed Delivery, Transactional
messages, event driven messaging:
The projects have to use suitable and transactional/reliable SAP interface technologies like BAPI/RFC, IDocs, ALE, or WebServices with Reliable Messaging (if the SAP version provides WebServices or supports the services paradigm) to receive and send messages from/to different destinations.
Case 2: If the risk analysis assesses that the consequences of losing messages are not serious for the business (there are no needs for Guaranteed Delivery, Transactional messages, etc.):
• It is permissible to use File integration (P2P non-transactional integration). SFTP is a viable solution in order to comply with the security requirements.
• It is permissible to use WebServices (without Reliable Messaging).
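The two cases can be expressed as a small selection helper; the interface names and the rule simply restate the decision matrix, so treat this as a sketch rather than an authoritative rule engine:

```python
def candidate_interfaces(guaranteed_delivery: bool, transactional: bool) -> list:
    """Map the two cases to candidate SAP interfaces: strict requirements
    exclude plain file transfer and Web Services without Reliable Messaging."""
    if guaranteed_delivery or transactional:
        # Case 1: only transactional/reliable interface technologies qualify.
        return ["IDoc", "ALE", "BAPI/RFC", "WebService with RM"]
    # Case 2: losing a message is acceptable, so more options open up.
    return ["IDoc", "ALE", "BAPI/RFC", "WebService with RM",
            "WebService without RM", "File/FTP", "SFTP"]

strict = candidate_interfaces(guaranteed_delivery=True, transactional=True)
relaxed = candidate_interfaces(guaranteed_delivery=False, transactional=False)
```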
4.1.3.1 Integration Scenario
To create a total list of the integration scenarios that are possible when integrating applications with many adapters, like SAP, requires great effort. The list depends on:
• The requirements on the interfaces
• SAP to SAP scenarios
• SAP to non-SAP scenarios
• Internal/External applications that want to communicate with SAP
• The available adapters that exists in the sending/receiving applications
• As mentioned earlier, the availability of an EAI tool in the organization (e.g. BizTalk Server, SAP PI/XI) and the existence of SAP adapters to handle IDocs, BAPI and RFC messages.
• The volume of the messages is a very important parameter in defining a reliable interface. You need to consider that an interface that is using a BAPI is not suitable for large messages. It is wise to run performance tests for the selected interface.
The figure below lists some of the common scenarios:
1. Scenario 1: an application has the possibility to connect to a WebService
to receive information by http/https protocol.
2. Scenario 2: an application has the possibility to send IDocs to SAP.
3. Scenario 3: an application has the possibility to receive IDocs from SAP.
4. Scenario 4: an application has the possibility to connect to a BAPI/RFC.
5. Scenario 5: an application has the possibility to receive a File from SAP.
6. Scenario 6: an application has the possibility to send a File to SAP.
7. Scenario 7: an application has the possibility to receive ALE from SAP.
8. Scenario 8: an application has the possibility to send ALE to SAP.
Figure 13: Integration scenarios
4.1.4 Estimate the solution
As a main principle, the estimation, the requirements specification (for the integration part) and the design of the integration solution will be the responsibility of the project.
If we assume that the enterprise has invested in an EAIP (an EAIP is available for the enterprise), and if we assume that the organization has an integration team (or integration competence center), then it is natural that the estimation of the required efforts for building the integration solution and the required integration flows will be provided by the experts from the integration team.
If we assume that the enterprise uses the most popular application integration tool (BizTalk Server 20XX), the BizTalk architects and developers need to provide a diagram for each integration flow. The following diagram is taken from a Microsoft page . The diagram illustrates the flow of a message through the BizTalk Server engine.
The main steps in the message flow are:
1. A message is received through a receive port, which contains a Receive Adapter, a Receive Pipeline and an Inbound Map. The received message from the sending system will be handled by a pre-configured adapter. BizTalk has many adapters (e.g. WCF-SAP, HTTP, FTP, File, etc.).
2. The receive pipeline processes the message and performs different operations (e.g. decoding, disassembling, validating, etc.).
3. The receive ports may transform a message to an internal message format
defined and deployed in the BizTalk environment.
4. The message is inserted into the MessageBox, which is based on a SQL Server database. According to the publish-subscribe pattern, the subscribers (e.g. Orchestrations or Send Ports) will receive a copy of the right message that the subscriber is listening to.
5. If the subscriber is an orchestration, then the orchestration will process the message, execute the required workflow and send the message back to the MessageBox.
6. If the subscriber is a send port, then the message can be transformed into a
required output format.
7. Before the message is sent to the destination, the send pipeline performs some operations like pre-assembling, assembling and encoding.
8. The send port uses the pre-configured adapter to transmit the message to
the destination system (e.g. SAP).
The above steps and the requirements part of an integration solution will define the
work breakdown structure (WBS) for an integration flow, see the next table.
Figure 14: The path of a message in BizTalk Server
The work breakdown structure (WBS) for an integration flow is divided into different
sections. The main sections are:
• Business Specification
• Sending Application
• Receiving Application
• Information Flow A
• General Activities
Business Specification Comments
The integration team may ask whether there are any process descriptions available for the project.
Describe the Integration Scenarios: The type of integration tool
to be decided (integration platform or point-to-point).
Specification of use cases
Analyse the provided use cases from the project and check the
impacts of the use cases on the Integration solution. Create a
use case diagram (for the integration requirements if needed).
Specification of the flow characteristics
The SLA parameters for the integration flow (or a group of
integration flows) need to be agreed with the business/project.
Repeated for each sending application that is included in the integration
General knowledge of application
Understanding the main capabilities provided by the application.
Analyse the application architecture and the underlying
technology and building blocks of the application.
Provide documentation and contact
Find the contact persons (system owner, system manager) for the application
Connectivity at Sending Application
Clarify the options available
Selection mechanism for connecting Includes choice of standard adapters, custom designed adapters
Design or configuration and testing of the
Setting up test environment for sending
application Server Configurations
Repeated for any receiver application that is included in the integration
General knowledge of application See above for Sending Application
Provide documentation and contact See above for Sending Application
Connectivity at Receiving Application See above for Sending Application
Clarify the options available See above for Sending Application
Selection mechanism for connecting See above for Sending Application
Environment See above for Sending Application
Setting up test environment for receiver
application See above for Sending Application
Information Flow A Repeated for each information flow
Writing specification Data Mapping Specification document
Workshops, meeting with users
Design of the mapping
Testing the mapping (unit test)
Integration Logic Realized often through orchestration in BizTalk (if needed)
Design Design of the orchestration
Testing the integration logic (unit test)
Specification of test data Constructing test data
Generate test data Providing test data
Configuration of test generators
Deploy to Acceptance Test environment
Deploy to Production environment
Activation of service monitoring Supervision of queues, info flow measurement
Handover of Operation
Update on the operation documentation Operating instructions, interface contracts
Archiving of project documentation
Approval of deliverables
Table 4: WBS for an Information Flow
4.1.5 Create a budget
The budget will be based on the estimation of all required information flows from the project. As a project manager, and in agreement with the integration architect and the integration developers, you may add 5-10% to the total estimated hours/days for the integration flows. You may use that extra time in case something unexpected happens.
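The contingency rule can be expressed as simple arithmetic (the function name and the example figures are ours, chosen purely for illustration):

```python
def integration_budget(estimated_hours: float, contingency: float = 0.10) -> float:
    """Add a 5-10% contingency on top of the estimated integration hours."""
    assert 0.05 <= contingency <= 0.10, "guideline suggests 5-10%"
    return round(estimated_hours * (1 + contingency), 1)

total = integration_budget(400, 0.10)   # 400 estimated hours + 10% buffer
```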
4.1.6 Plan the implementation
In my experience, the best way to define and agree on the integration requirements is to conduct workshops with the stakeholders. It is very important to prepare for these workshops (e.g. reading the available documents for the project, sending questions to different stakeholders, understanding the main concerns and the ambition levels of the stakeholders, and preparing the main alternatives).
As an integration architect you need to involve the developers at an early stage. This is important to make the developers engaged and to see the solution from a development point of view. As an integration architect you may miss some hidden aspects or may not have analyzed the complexity of the integration flows.
Planning the implementation is the task of the project manager and the team
manager (line manager of the integration team).
4.2.1 A Case Study (how to use this document)
In general, it is a big challenge to determine the right interface. The integration architects and the developers from the application teams should have enough knowledge not only about their applications but also about the interfacing technologies.
Today, the best way to get an integration project successfully implemented is to use the expertise of an experienced team that did the job in the past or has the competence and the theoretical basis to design and implement an integration solution.
The Case Study:
Imagine that a Work Order Management System (WOMS) needs to be updated with different information from different systems. For simplicity we will concentrate on the required information from HR (human resources). The data is in the SAP ERP system. The WOMS needs to be updated when a new employee in the Service Organisation is hired. You, as an integration architect/expert, will be involved in the Work Order Management System project.
1. Before analyzing the requirements, we need to identify the stakeholders of
the integration requirements/the integration solution. As an integration
architect/expert you may start with the project manager. With the help of the project manager, you may start to identify the main stakeholders (e.g. the HR process/information owner, the HR administrator, the WOM process/information owner, developers, the system managers of the involved systems, and other stakeholders such as the WOMS dispatcher and the field worker).
2. As we mentioned previously, when the stakeholders are identified, the
project should begin with gathering and analysing the requirements on the
information flow(s). We need to discuss and send the questions to our
stakeholders, see table 2; relevant questions that should be asked to the
integration solution stakeholders. Our assumption is: the information about
the new employee should be available as soon as possible to the WOMS in
order to assign new orders to the employee “the field worker”. The
information should be sent/fetched directly (event driven pattern) and the
information flow requires Guaranteed Delivery. HR data is by nature sensitive and needs to be securely transmitted between applications.
3. When the requirements are defined, the available interfaces need to be
analysed; the project then selects the right interface from the decision
matrix by matching the requirements to it.
a. The analysis of the connectivity possibilities for the application
"WOMS" that needs to be integrated with SAP shows that the application:
i. WebService client/WebService: can connect to other systems
with a WebService client, or can provide a WebService to exchange
information over the http/https protocol.
ii. Can connect and be connected via a File interface.
b. The analysis of the connectivity possibilities for the SAP application
shows that the application:
i. Has a standard IDoc and a standard BAPI for processing HR data.
ii. Can connect and be connected via a File interface.
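The matching of requirements against the decision matrix can be pictured as a simple lookup. The sketch below is illustrative only: the requirement keys and interface names are a hypothetical simplification of the decision matrix described earlier in this document, not an official SAP classification.

```python
# Hypothetical sketch of the decision-matrix lookup: a requirement
# profile (Guaranteed Delivery? event driven?) maps to the candidate
# SAP interfaces discussed in this document. Groupings are illustrative.

DECISION_MATRIX = {
    # (guaranteed_delivery, event_driven): candidate interfaces
    (True, True): ["IDoc/ALE", "WebService with Reliable Messaging"],
    (True, False): ["IDoc/ALE", "transactional BAPI/RFC"],
    (False, True): ["WebService without Reliable Messaging"],
    (False, False): ["File/FTP", "WebService without Reliable Messaging"],
}

def candidate_interfaces(guaranteed_delivery, event_driven):
    """Return the interfaces that satisfy the two key requirements."""
    return DECISION_MATRIX[(guaranteed_delivery, event_driven)]

# The WOMS flow requires Guaranteed Delivery and is event driven:
print(candidate_interfaces(True, True))
```

In the WOMS case the lookup immediately rules out plain File/FTP, which matches the conclusion drawn in the next step.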
4. When we map the requirements to the available ways of communicating
between these two applications, you see (according to the decision matrix)
that, in principle, the project has four alternatives for integration with SAP,
and one of these alternatives is not recommended (using files when the
project requires Guaranteed Delivery for messages coming from SAP ERP).
a. Alternative 1: The first alternative is based on the possibility that the
SAP system can connect to a WebService. The SAP system can access
the WOMS with a WebService client over the https protocol. An
asynchronous WebService client/WebService interaction with
Reliable Messaging is the best alternative here. The advantage of
this solution is that the SAP system can send the information about
the new employee as soon as the information is available in the SAP system.
Figure 15: The SAP system can connect to a WebService
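The push pattern of Alternative 1 can be sketched as follows. In a real deployment the at-least-once semantics come from WS-Reliable Messaging in the middleware stack; the retry loop below only illustrates the idea at application level. The endpoint URL, payload fields and transport function are hypothetical.

```python
# Minimal sketch of Alternative 1: the SAP side pushes a new-employee
# event to the WOMS web service and retries until it is acknowledged.
# Real Reliable Messaging is handled by the WS-RM stack, not hand-rolled.

import json

WOMS_ENDPOINT = "https://woms.example.com/employees"  # hypothetical URL

def send_new_employee(employee, transport, max_retries=3):
    """Push a new-employee event; retry until the receiver acknowledges."""
    payload = json.dumps(employee)
    for _ in range(max_retries):
        if transport(WOMS_ENDPOINT, payload):  # True means acknowledged
            return True
    return False  # would be escalated to an error queue / operator

# Simulated transport that fails once, then acknowledges:
calls = []
def flaky_transport(url, payload):
    calls.append(url)
    return len(calls) > 1

ok = send_new_employee({"id": "E100", "name": "Jane"}, flaky_transport)
print(ok)
```

The event fires as soon as the HR data exists in SAP, which is exactly the advantage named above: no polling delay between hiring and WOMS update.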
b. Alternative 2: The second alternative is based on the possibility that
the SAP system can provide WebServices. The WOMS can
access the SAP system with a WebService client over the https
protocol. An asynchronous or synchronous WebService
client/WebService interaction with Reliable Messaging could be a
good alternative here. The disadvantage of this solution is that the
WOMS needs to request data from the SAP system without
knowing whether a new employee has been hired or not. You need to schedule
this request at an optimal frequency in order to minimise the load and
at the same time detect a new employee as soon as possible.
Figure 16: The SAP system can be connected with a WebService
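The polling logic behind Alternative 2 can be sketched like this. The fetch function stands in for the real SOAP/HTTPS call to the SAP-provided service; the record layout is hypothetical. Note how the poller must itself detect what is new, which is the extra work (and latency) that the push pattern of Alternative 1 avoids.

```python
# Sketch of Alternative 2: WOMS polls a SAP-provided web service on a
# schedule and forwards only employees it has not seen before.

def poll(fetch_employees, known_ids):
    """Return employees not yet known to WOMS and remember their IDs."""
    new = [e for e in fetch_employees() if e["id"] not in known_ids]
    known_ids.update(e["id"] for e in new)
    return new

known = {"E001"}                                   # already in WOMS
snapshot = [{"id": "E001"}, {"id": "E002"}]        # current SAP answer

first = poll(lambda: snapshot, known)   # detects E002
second = poll(lambda: snapshot, known)  # nothing new on the next poll
print(first, second)
```

Choosing the polling interval is the trade-off mentioned above: a short interval detects new hires quickly but loads the SAP system; a long interval does the opposite.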
c. Alternative 3: This alternative is based on exchanging information via
files. The SAP system sends a file every week or every day, or when a new
employee arrives. This alternative is not recommended, but it could
be used if you do not have the possibilities described in alternatives 1, 2 and 4;
for more information about why this alternative is not recommended,
see the sections about Files/FTP.
Figure 17: Exchanging information via files
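For completeness, a file exchange of the kind described in Alternative 3 can be sketched as below. The file path and record layout are hypothetical. The sketch also shows why the alternative fails the Guaranteed Delivery requirement: if the file is lost, overwritten or only half written between export and import, neither side notices.

```python
# Sketch of Alternative 3: SAP exports a CSV file to a shared location;
# WOMS imports it later. There is no acknowledgement and no delivery
# guarantee in this pattern.

import csv
import os
import tempfile

def export_employees(employees, path):
    """SAP side: write the employee records to a CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["id", "name"])
        writer.writeheader()
        writer.writerows(employees)

def import_employees(path):
    """WOMS side: read whatever file is currently there."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

path = os.path.join(tempfile.gettempdir(), "hr_export.csv")
export_employees([{"id": "E100", "name": "Jane"}], path)
print(import_employees(path))
```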
d. Alternative 4: As we mentioned in the beginning, we assume that the enterprise has
invested previously in an integration platform (as we did in 2004;
our BizTalk Server 2013 platform is now one of the biggest
platforms in Europe). One of the biggest advantages of an EAIP
(Enterprise Application Integration Platform) is the out-of-the-box
features (the availability of the integration capabilities). Systems like
SAP, WOMS, CRM, SRM, etc. do not need to implement these
capabilities, which makes life easier for them. Systems that
need to be connected do not need to become more complex than they are.
Adapting these systems to receive and send messages to different
systems with different formats and protocols, and
connecting/implementing different adapters, takes a lot of effort and
makes the maintenance of these systems a time-consuming task.
The integration platform, among its other integration capabilities,
receives, sends and routes messages from the sending systems to the
receiving systems. An integration platform with many out-of-the-box
features (e.g. different adapters) can be configured and
adapted to the interoperability possibilities of the sending and
receiving systems.
If we add the integration platform to figure 13: Integration
Scenarios, the scenarios change, see the following figure:
Figure 18: Integration Scenarios with involvement of an Integration Platform
The disadvantage of involving an EAIP is performance: the delay
caused when a message passes through another system (the EAIP: adapters,
database, etc.) before it is received in the destination
system, see figure 14: The path of a message in BizTalk Server.
Another disadvantage is that a project needs to deal and work with an
additional team (the integration team) when the project needs to
define the requirements, design the solution, and implement, test and
hand over the outcomes. This could take time, especially if the
integration team is loaded with many simultaneous assignments. An
integration solution based on the involvement of an EAIP (in our case
BizTalk Server 2013) could look like the following picture, see figure 19.
The project should avoid a point-to-point solution without
Transactionality and Guaranteed Delivery, because of the
requirements. In this alternative we think that the right interface
is IDocs, because an IDoc is suitable for event-driven and
transactional messages, see the decision matrix. When a new
employee is recruited to the company and needs to be
added to the organisation that works with Work Orders, an HR
administrator uses the SAP system to insert this new employee into
the ERP system. An SAP ABAP programmer configures the system so that
an IDoc is triggered when changes occur in HR administration (in
our case, a new employee arriving in organisation X). The IDoc is
received in the BizTalk platform by the WCF SAP adapter. The
message is mapped to the Work Order Management System's
WSDL WebService and is sent to the WOMS over the https protocol.
Figure 19: Integration Flow with involvement of an Integration Platform
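The mapping step performed by the integration platform can be sketched in miniature. Real IDocs (e.g. the HR master data IDoc HRMD_A) are fixed-width records whose layout comes from SAP metadata, and in BizTalk the WCF SAP adapter and a map handle this; the simple "SEGMENT field=value" layout below is purely illustrative, and the segment and field names are hypothetical simplifications.

```python
# Much-simplified sketch of what the platform does with an inbound IDoc
# before calling the WOMS web service: split the IDoc into segments and
# pick out the fields WOMS needs. Layout and names are illustrative only.

def parse_idoc(lines):
    """Group 'SEGMENTNAME field=value ...' lines by segment name."""
    segments = {}
    for line in lines:
        name, _, fields = line.partition(" ")
        segments.setdefault(name, []).append(
            dict(f.split("=", 1) for f in fields.split())
        )
    return segments

idoc = [
    "E1PLOGI OBJID=E100",
    "E1PITYP NAME=Jane ORG=ServiceOrg",
]
parsed = parse_idoc(idoc)
print(parsed["E1PITYP"][0]["NAME"])
```

The parsed structure would then be mapped to the WOMS WebService message and sent over https, as described above.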
5. Check and Act
The scope of this document does not include implementing the interface, testing it
and acting if errors are found. We may, however, need to monitor the requirements and
check the impact of changed and new requirements on the integration solution.
As we mentioned in table 4: WBS for an Information Flow, modifications need to be
identified, analysed and estimated.
1 Requirement Specification for Application Integration, Åke Svilling, Alaa Karam
2 Enterprise Integration Patterns, Gregor Hohpe et al., ISBN 9780321200686
8 The Position of SAP, Tino Scholman
13 SAP for Retail, Heike Rawe, ISBN 978-1-159229-213-4