OCI DR in the Cloud

Business continuity via disaster recovery is an essential element of IT and takes on many forms. At the high end are high availability solutions that provide real-time replication of systems. While these systems provide seamless continuity during outages, they are large, complex, and expensive, justifiable only for the most critical business applications. At the other end of the continuum, disaster recovery is little more than tape backup or backup to NAS, with complicated and lengthy restore procedures that can take hours or days.
A major improvement can be made in disaster recovery with a solution that provides business continuity in a model that simply extends the existing IT architecture into the Cloud.

The RackWare RMM migration/DR platform is a non-intrusive, agentless technology with pre- and post-migration configuration capabilities that is easy to set up and configure for complex enterprise environments and applications. RackWare RMM supports both Linux and Windows workloads for migration to Oracle Cloud Infrastructure.

The RackWare RMM platform provides a flexible and all-encompassing solution for migration and disaster recovery. RackWare helps enterprises and large organizations take advantage of the agility promised by Oracle Cloud Infrastructure. RackWare’s platform eliminates the complexity of protecting, moving, and managing large-scale applications, including critical business applications and their workloads, into the Oracle Cloud. By leveraging Oracle Cloud Infrastructure, enterprise customers can forgo the upfront purchase of duplicate recovery hardware and the cost of setting up, configuring, and maintaining that hardware.

RackWare RMM provides the following value proposition for enterprises in the Oracle Cloud:

  • Non-disruptive / Live Captures – no agents installed; safe and secure replication of your production environments
  • Network and Application Discovery – automatically discover network configurations and applications, allowing you to reconfigure them in the OCI environment during migration
  • Universal DR Protection – RackWare’s support spans physical and virtual environments, even complex ones with large SQL clusters and network-attached storage
  • Seamless Failback – to physical and virtual environments, for simple disaster recovery drills
  • Cost Reduction – an orchestration engine supports multiple RPO and RTO policies based on tolerance, reducing costs through less expensive compute, network, and storage utilization

Storage Methods

There are two storage methods available for disaster recovery.

Store and Forward

Store and Forward creates an image of your source workload in the RMM’s storage. When using this method, the RMM needs a datastore large enough to hold the used data from each source host, less typical compression savings.

Store and Forward is required when using the auto-provision feature, whereby the RMM provisions compute resources only during a DR event or a test/drill event, or when you want the multi-stage protection of keeping data in a stored image and then syncing from that image to the target compute resources.

Passthrough

With Passthrough, the RMM does not store a copy of the used data from source hosts. Instead, it acts as a passthrough proxy, syncing the source workload data through itself and onto the target DR instances.

How it works

RMM provides a DR solution that builds on its image mobility and elasticity features to bring economical DR to enterprises. The building blocks of RackWare’s DR solution include onboarding, cloud bursting, and the policy framework to automate necessary functions. Captured images from production (origin) instances are cloned and pushed out to a local or remote DR site. Changes in production images are periodically synchronized with the remote images, keeping the original host image and the DR image in sync. In the event of an outage at the origin site, the up-to-date image at the DR site can assume operations through RackWare’s failover mechanism.

After the production instance is repaired and operational, it’s easy to restore the origin site and pick up any changes made to the CloudImage in the cloud. When the origin site is restored to its operational state, the administrator can use the capture-from-cloud feature to refresh the original image and synchronize any changes that occurred during the outage.

Overhead on the origin host is extremely small, involving only the resources needed to take a delta snapshot. The WAN link therefore carries only the delta of information, keeping bandwidth needs and sync time to a minimum. It’s important that image updates include user data, operating system updates, and application installations and configuration changes so that the recovery image behaves exactly like the production image should a failover occur. The cloud DR feature supports all of these. While OS updates are less frequent, it is still important to ensure that kernel patches are kept in sync with the DR image. When updating the OS, an image refresh operation is done from the RMM first, before the sync to the CloudImage. Should the production system be compromised or inoperable, the CloudImage is automatically launched and runs with the latest synchronized changes.

The Oracle and RackWare partnership provides a seamless experience for migrating to Oracle Cloud Infrastructure and securing customer workloads with dynamic provisioning and disaster recovery.

About TekStream
TekStream accelerates clients’ digital transformation by navigating complex technology environments with a combination of technical expertise and staffing solutions. We guide clients’ decisions, quickly implement the right technologies with the right people, and keep them running for sustainable growth. Our battle-tested processes and methodology help companies with legacy systems get to the cloud faster, so they can be agile, reduce costs, and improve operational efficiencies. And with 100s of deployments under our belt, we can guarantee on-time and on-budget project delivery. That’s why 97% of clients are repeat customers. For more information visit https://www.tekstream.com/

Integrating Oracle Cloud ERP (Cloud Fusion) With External Applications Leveraging Oracle BI Reports

By: Greg Moler | Director of Imaging Solutions

 

Does your organization utilize Oracle Fusion Applications such as Oracle Cloud ERP, Cloud Human Capital Management, and Project Portfolio Management Cloud?  Are you looking for ways to extend the functionality of these applications?  These platforms store a lot of critical business data, much of which has the potential to be integrated with other third-party applications.  In this article, we explore use cases and considerations for leveraging Oracle BI reports to integrate Oracle Fusion data with external applications.

 

What are Oracle BI reports?

Oracle’s Fusion Applications, including Cloud ERP, Cloud Human Capital Management, and Project Portfolio Management Cloud, as well as many others, use the Oracle Business Intelligence (BI) platform to provide reports and analytics.  BI reports provide a way to query and report against the underlying data in these platforms.  In the cloud platform, they take the place of traditional database queries.  They can be used in a variety of functions to extract data for use with external applications.

 

Use Cases

When it comes to potential use cases for extending your Oracle Fusion application’s data, the possibilities are endless.  If you can think of a need, we can design a way to build it.  In this section, we take a glimpse at some specific use cases for integrating data from Oracle Cloud ERP.

  • Automating GL/Project Account Coding: Manually coding invoices can be time-consuming for AP personnel. This process can be streamlined by automating business logic to programmatically apply GL and Project codes to invoices.  For example, often one attribute on the invoice drives the rest of the coding.  Another common scenario is vendor distribution sets, where each vendor has a pre-defined set of charge account strings and percentages that get applied to each invoice.  BI reports can be structured to take input from your third-party application, such as a vendor ID, and return coding data.
  • Workflow routing and approval hierarchy: If your application does workflow routing or uses an approval hierarchy, it will need to know which user documents should be assigned to. Often, this is based on data on the document and can be retrieved from Cloud ERP.  BI reports can be designed to return the appropriate workflow assignee based on the pertinent ERP data.  If your organization also utilizes Oracle Human Capital Management (HCM), you can extend functionality even further by directly accessing employee data in the approval hierarchy.
  • Vendor data: Vendor data stored in Oracle Cloud ERP can be incredibly useful in a variety of scenarios, for example, vendor identification on invoices, purchase orders, and other documents.  Another common use case is the use of vendor-specific attributes or descriptive flexfields (DFFs) defined in Cloud ERP.  These attributes can be easily maintained by the business in the Fusion interface and leveraged in an external application to drive automation such as account coding or workflow routing, as described previously.
  • Validation data: More than likely, your application will need to validate that the data on a document or object is correct before allowing it to proceed. The information needed to perform this validation is typically stored in Cloud ERP, for example, whether a purchase order has enough open balance available for a line item.  BI reports provide a way to perform these validations, simply by taking input parameters from the external application and returning corresponding data from Cloud Fusion.
  • Reporting data: Reporting options within Fusion Applications and the Oracle Business Intelligence platform can be limited. These capabilities can be supplemented by an external reporting tool.  BI reports provide a way to extract data out of Oracle for analysis and reporting in an outside application.

 

Considerations when creating Oracle BI reports

When designing and creating Oracle BI reports for integration with an external application, there are many considerations including:

  • Data Structure: Design a reusable format for your BI reports so that data is returned in a consistent structure. This will simplify integration points with the reports.
  • Data return type: What return data type will your application require? XML? CSV?
  • Catalog folder structure: Consider how to organize your catalog folder structure. Which reports function together and should be grouped together?  Which reports should be promoted together between environments?  The easiest way to migrate reports is to use the Archive/Unarchive function on an entire folder.  It is important to consider which reports should be archived together.
  • Testing options: The most basic way to test BI reports is through the Fusion Catalog Manager. This provides a good baseline test.  More than likely, the reports will be called via web service, so the next layer of testing should be done using a tool such as SoapUI that lets you call the web service and view responses exactly as the external application will receive them.  Keep in mind that the report payload in the response is base64-encoded and must be decoded before it can be consumed; a sketch of such a call follows this list.
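To make that web-service layer concrete, here is a minimal Python sketch of calling a BI report and decoding its payload. The host, report path, parameter name, and credentials are placeholders, and the namespace and security configuration shown are assumptions that should be verified against the ExternalReportWSSService WSDL for your Fusion release.

```python
# Minimal sketch: run a BI report over SOAP and decode the base64 payload.
# Host, report path, parameter name, and credentials are placeholders; verify
# namespaces and security against /xmlpserver/services/ExternalReportWSSService?WSDL.
import base64
import xml.etree.ElementTree as ET

import requests

ENDPOINT = "https://your-fusion-host/xmlpserver/services/ExternalReportWSSService"

envelope = """<soapenv:Envelope
    xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:pub="http://xmlns.oracle.com/oxp/service/PublicReportService">
  <soapenv:Body>
    <pub:runReport>
      <pub:reportRequest>
        <pub:attributeFormat>xml</pub:attributeFormat>
        <pub:reportAbsolutePath>/Custom/Integrations/VendorLookup.xdo</pub:reportAbsolutePath>
        <pub:sizeOfDataChunkDownload>-1</pub:sizeOfDataChunkDownload>
        <pub:parameterNameValues>
          <pub:item>
            <pub:name>P_VENDOR_ID</pub:name>
            <pub:values><pub:item>12345</pub:item></pub:values>
          </pub:item>
        </pub:parameterNameValues>
      </pub:reportRequest>
    </pub:runReport>
  </soapenv:Body>
</soapenv:Envelope>"""

resp = requests.post(
    ENDPOINT,
    data=envelope,
    headers={"Content-Type": "text/xml; charset=UTF-8", "SOAPAction": ""},
    auth=("integration.user", "password"),  # your tenant may require a WS-Security UsernameToken instead
    timeout=60,
)
resp.raise_for_status()

# The report data is returned base64-encoded inside the <reportBytes> element.
tree = ET.fromstring(resp.content)
report_bytes = next(el.text for el in tree.iter() if el.tag.endswith("reportBytes"))
print(base64.b64decode(report_bytes).decode("utf-8"))
```

The same request body is what a SoapUI test project would send, so the two testing layers can share one envelope.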

 

Interested in getting more out of your Cloud ERP data?  Contact TekStream to learn about how we can assist with your Cloud Fusion integration needs!

3 Ways to Migrate Custom Oracle Middleware Applications to the Cloud

Understanding and classifying middleware applications is one of the most critical and complex tasks of any Cloud adoption process. No doubt, your company has several diverse applications integrated into your system. Off-the-shelf applications for sure, but custom-built and legacy applications as well.

Whether you are considering migrating your Oracle solution to Amazon Web Services (AWS), Oracle Cloud Infrastructure (OCI), or another Cloud platform, each of your legacy applications will have its own migration needs that will need to be accounted for during your migration efforts.

3 Methods for Migrating Your Middleware Applications to a Cloud Environment

Re-Hosting: Lift & Shift Migrations

The first method for migrating your middleware applications to the Cloud applies to applications that rely on traditional server/compute technologies, or that, because of their complexity, won’t benefit from refactoring to newer technologies like serverless or microservices.

For these applications, we recommend leveraging the Infrastructure as a Service (IaaS) offerings provided by AWS and OCI (depending on your preferred platform). With these IaaS offerings, you can re-create the compute/servers required to run these applications just like you would with a traditional datacenter. You can also layer on new Cloud tools and concepts like:

  • On-demand pricing
  • Next-generation networking and security
  • Additional service integrations like Content Delivery Networks or API gateways

As a note, many Oracle middleware applications will fall under this category. Most off-the-shelf WebLogic applications use stateful sessions for clustering and will require additional effort to integrate with newer Cloud concepts like auto-scaling.

Re-Platform: Migrating Applications to a Managed Platform

For this next method, you’re going to focus on applications that can (and should) be moved to a managed platform. AWS has several services available to support the deployment of custom applications that use various tech stacks (Java, PHP, .Net, etc.).

In these instances, AWS takes over the provisioning, management, and auto-scaling of compute/servers and network compliance. This can significantly reduce operational costs, as companies no longer need to maintain servers, operating systems, networks, etc. It also eases migration by removing infrastructure components from the mix.

Re-Architect: Recreating an Application for the Cloud

While many applications can be migrated via a “Lift-and-Shift” approach or onto a managed platform, others may need to be completely overhauled to function well in the Cloud. “Re-thinking” or “re-architecting” these applications allows your team to ensure they reach their full potential and realize the benefits of being deployed in the Cloud.

For example, you can explore opportunities to break down monolithic apps into smaller “micro” services and use serverless technologies like AWS Lambda, Amazon Simple Notification Service (SNS), or Amazon Simple Queue Service (SQS) to improve performance, as sketched below. At the same time, you can replace traditional Oracle RDBMS data sources with new concepts like data lakes, object storage, or NoSQL databases.
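As a small, hypothetical illustration of the messaging piece of such a re-architecture (the queue URL, region, and message fields below are made up), the sketch publishes and consumes events through an SQS queue with boto3 instead of having one service call the next tier directly.

```python
# Minimal sketch: decoupling two services with an SQS queue (queue URL is a placeholder).
import json

import boto3

sqs = boto3.client("sqs", region_name="us-east-1")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/order-events"

# Producer: a "micro" service publishes an event rather than calling the next tier directly.
sqs.send_message(
    QueueUrl=QUEUE_URL,
    MessageBody=json.dumps({"orderId": "A-1001", "status": "CREATED"}),
)

# Consumer: a worker (or a Lambda with an SQS trigger) polls, processes, and deletes events.
for message in sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=10,
                                   WaitTimeSeconds=5).get("Messages", []):
    print("processing", json.loads(message["Body"]))
    sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"])
```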

The Migration Support You Need

Regardless of your Cloud platform of choice, careful consideration needs to be given for how you are going to migrate your legacy middleware applications. You can also use your upcoming migration as an opportunity to audit your applications and determine if there are any that can be sunset or rolled into a new system or application to drive further efficiency.

Have questions? TekStream has deep experience deploying enterprise-grade Oracle middleware applications both on traditional data centers as well as cloud environments like AWS or OCI. As part of any migration, we utilize that experience to help classify applications and apply best practices to the deployment of those applications in the Cloud.

Are you looking for more insight, tips, and tactics for how best to migrate your legacy Oracle solution to the Cloud? Download our free eBook, “Taking Oracle to the Cloud: Key Considerations and Benefits for Migrating Your Enterprise Oracle Database to the Cloud.”

 

If you’d like to talk to someone from our team, fill out the form below.

How to Leverage a Bring-Your-Own-License Model on Oracle Cloud Infrastructure and Amazon Web Services

It’s no secret that Oracle licensing can be complicated. Between the never-ending legal jargon, core calculations, and usage analysis, it’s often navigating these licenses, not the underlying technology, that halts even the most well-intentioned Oracle Cloud migration efforts.

In this blog post, we’re going to take a closer look at how you can leverage your existing Oracle license to support your Oracle Cloud migration efforts to either Oracle Cloud Infrastructure (OCI) or Amazon Web Services (AWS).

What is a Bring Your Own License Model, Anyway?

Simply put, Bring Your Own License (BYOL) is a licensing model that lets you use your current on-premise Oracle licenses to support your Oracle migration and deployment to the Cloud, often at significant cost savings.

BYOL on OCI

Is your organization leaning toward migrating your legacy Oracle system to OCI? If you have any existing Oracle software licenses for services like Oracle Database, Oracle Middleware, or Oracle Business Intelligence, you can leverage those existing licenses when subscribing to Oracle Platform Cloud Services (Oracle PaaS).

With BYOL, you can leverage existing software licenses for Oracle PaaS subscriptions at a lower cost. As an example, if you already have a perpetual license for Oracle Database Standard Edition, then you can leverage that license to purchase a cloud subscription to Standard Edition Database as a Service at a lower cost.

The total cost of ownership calculation can be complex with this option, as you need to weigh your existing support costs and the added value of a cloud-based, self-healing, self-patching solution against the cost of buying the solution outright without BYOL. TekStream can help you weigh these options if you are thinking about leveraging BYOL for your cloud journey.

How Do You Use Your BYOL for Oracle PaaS?

So, how exactly do you use your existing Oracle software license to support your OCI migration needs? It’s easier than you may think:

  • Select the specific Oracle BYOL options in the Cost Estimator to get your BYOL pricing.
  • Apply your BYOL pricing to individual cloud service instances when creating a new instance of your PaaS service. BYOL is the default licensing option during instance creation for all services that support it.

For example, when creating a new instance of Oracle Database Cloud Service using the QuickStart wizard, the BYOL option is automatically applied.

Bring Your Own License to AWS

Oracle can be deployed on AWS using its compute resources (EC2). As with a standard server in your datacenter today, when using this migration strategy you are responsible for the licenses of any software running on the instances (including Oracle Database, middleware, or any other software).

You can use your existing Oracle licenses to run on AWS. If you choose this licensing approach, it is important to consider a couple of supporting factors.

If you are licensing a product by processor or named users on this platform, you need to consider the Oracle core multipliers referenced in the terms and conditions of your license agreement.
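As a rough, hypothetical illustration of that arithmetic: the conversion below reflects a common reading of Oracle's cloud licensing policy for authorized cloud environments (two vCPUs counted as one processor license when hyper-threading is enabled); the instance sizes and license counts are made up, and the factors should always be confirmed against your own agreement.

```python
# Hypothetical sizing check: Enterprise Edition, processor metric, running on EC2.
# Assumes the commonly cited "2 vCPUs = 1 processor license (with hyper-threading)"
# reading of Oracle's cloud policy; confirm against your own license agreement.
def processor_licenses_needed(vcpus: int, hyperthreading: bool = True) -> int:
    # Round up so an odd vCPU count still requires a whole license.
    return (vcpus // 2 + vcpus % 2) if hyperthreading else vcpus

owned_licenses = 8
planned_vcpus = 16            # e.g., one 16-vCPU EC2 instance

needed = processor_licenses_needed(planned_vcpus)
status = "OK" if needed <= owned_licenses else f"shortfall of {needed - owned_licenses}"
print(f"Need {needed} processor licenses; own {owned_licenses}; {status}")
```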

If you are using employee or user-based metrics, you can deploy solutions on AWS with little concern about these issues.

Many Oracle Unlimited and Enterprise License Agreements do not allow usage in AWS. If you are using one of these options for your Oracle licensing, we would recommend reviewing your contracts carefully before deploying these Oracle licenses on AWS.

Is the BYOL Licensing Model Right for You?

Regardless of which Cloud platform you choose (AWS or OCI), a Cloud migration is the perfect opportunity to reexamine your Oracle license structure. Whether you opt for the BYOL licensing model or choose to utilize a new licensing structure, take this opportunity to identify ways to reduce the cost of your overarching licensing structure.

Learn about alternative licensing models by downloading our free eBook, “A Primer on Licensing Options, Issues, and Strategies for Running Oracle CPU-based Licenses on Cloud.”

Need help? TekStream can help demystify the Oracle licensing process. We provide straightforward counsel and, most importantly, identify cost-saving opportunities while still maintaining full licensing compliance.

 

If you’d like to talk to someone from our team, fill out the form below.

Migrating Your Enterprise-Level Oracle Solution to the Cloud? Key Benefits and Drawbacks of Amazon Web Services and Oracle Cloud Infrastructure

There are a plethora of cloud platforms available to Enterprise-level companies that are exploring options for migrating their current Oracle solution to a cloud environment. While we won’t name them all, a typical shortlist is going to include platforms familiar to us all: Google Cloud Platform, Microsoft Azure, Amazon Web Services, and Oracle’s own Oracle Cloud Infrastructure.

In this blog post, we’re going to break down some of the key benefits and drawbacks of migrating your Oracle solution to two of the giants in the industry – Oracle Cloud Infrastructure (OCI) and Amazon Web Services (AWS).

Key Benefits and Drawbacks of OCI

Because OCI is a cloud-based platform, migrating to it also brings several essential benefits common to cloud environments, including:

  • More streamlined performance
  • Automatic software updates
  • Scalability
  • Disaster Recovery

Oracle utilizes some of the most advanced technologies to deploy its fully autonomous and scalable Autonomous Data Warehouse and Autonomous Transaction Processing for data warehousing and OLTP workloads, respectively. These technologies support the more advanced Oracle database features such as RAC, Data Guard, Redaction, Encryption, etc.

Another core benefit of choosing OCI as your Cloud platform: traditional data maintenance and migration utilities like GoldenGate, Data Guard, and RMAN are supported on the Database as a Service offering.

So, how does it differ from AWS? A key differentiator is that the autonomous and advanced features of Oracle databases are only available on OCI.

It can’t be all benefits, though; OCI also has its drawbacks. Chiefly, OCI services tend to have a high licensing cost, which can make OCI cost-prohibitive for small to medium workloads and, by extension, for small and medium businesses.

Also, OCI’s lack of a live chat feature with skilled support personnel can mean a frustrating troubleshooting experience for companies making the migration to OCI.

Key Benefits and Drawbacks of AWS

As a longstanding leader in the Cloud technology space, AWS has built a strong reputation as a trusted cloud partner for thousands of Enterprise companies. Plus, it has one of the most robust cloud-based offerings on the market through the AWS ecosystem.

When it comes to supporting Oracle on a cloud environment, AWS has integrated Oracle databases as part of its main Relational Database Service (Amazon RDS) offering. Amazon RDS is provided as part of the managed service and includes a reasonably comprehensive list of features that complement the base functionality of Oracle.

These features include:

  • Additional monitoring and metrics
  • Managed deployments for software patching and push-button scaling
  • Automated backup

AWS also provides an opportunity for companies to review their Enterprise Edition license, as it delivers similar technologies to Oracle’s Tuning and Diagnostic Packs as part of the base license.

So, what are the drawbacks of using AWS to support your Oracle Cloud migration? The most critical disadvantage is that it can be difficult and expensive to run some of the more robust features found in Oracle Enterprise Edition, including Data Guard, Management Packs, and Advanced Security. Keep this in mind if you are using these additional features.

AWS or OCI, Which Is Right for You?

There is no single right answer. Both platforms have their advantages and their drawbacks when it comes to supporting your business’s cloud-based Oracle needs. The “right” platform will be the one that best supports your specific business criteria.

If at any time you have questions concerning your specific cloud migration needs, please reach out to TekStream. Our team of Oracle experts has years of proven experience navigating the cloud-migration needs of our partners.

We also encourage you to download our eBook, “Taking Oracle to the Cloud: Key Considerations and Benefits for Migrating Your Enterprise Oracle Database to the Cloud” for even more information on how best to approach an Oracle cloud migration.

 

If you’d like to talk to someone from our team, fill out the form below.

WFR(ee) Things A Customer Can Do To Improve Extraction

By: William Phelps | Senior Technical Architect

 

When using Oracle Forms Recognition (“OFR”) or WebCenter Forms Recognition (“WFR”) with the Oracle Solution Accelerator or Inspyrus, clients often engage consulting companies (like TekStream) to fine-tune extraction of invoice data.  Depending on the data to be extracted from the invoice, the terms “confidence”, “training”, and “scripting” are often used in discussing and designing the solution.  While these techniques justifiably have their place, they may be overkill in many situations.

Chances are, if you are reading this article, that you may already be using WFR.  However, the extraction isn’t as good as desired.  You may have been using it for quite a while, with less-than-optimal results.

In reality, there are several no-cost options that a customer can (and should) pursue before considering ANY changes to a WFR project file or bringing in consulting.  Think of it as the “don’t step over a dollar to pick up a dime” approach.  Many seemingly impossible extraction issues are truly and purely data-related, and in all likelihood these basic steps will be needed anyway as part of any solution.  There is a much greater potential return on investment in simply doing the boring work of data cleanup before engaging consulting.

Start identifying the areas for free improvement by answering the following questions:

  1. Does the vendor address data found in the ERP match the address for the vendor found on the actual invoice image?
  2. Is the vendor defined in the ERP designated as a valid pay site?
  3. In the vendor data, are intercompany and employee vendors correctly marked/identified?
  4. Do you know the basic characteristics of a PO number used by your company?
  5. Are the vendors simply sending bad quality invoice images?

Vendor Address Considerations

The single biggest free boost a customer can give to extraction is to actually look at the invoice image for the vendor and compare the address found on the invoice to the information stored in the ERP.  WFR looks at the zip code and address line as key information points.  Mismatches in the ERP data will lower the extraction success rate.  This affects both PO and non-PO invoice vendor extraction.

To illustrate this point at a high level, let’s use some basic data tools found within the Oracle database.  The “utl_match” package gives a basic feel for how seemingly minor string differences can affect these calculations.

Using utl_match.edit_distance_similarity in a simple query, two strings can be compared as to how similar the first string is to the second.  A higher return value indicates a closer match.

  • This first example shows the result when a string (“expresso”) is compared to itself, which unsurprisingly returns 100.

  • Changing just one letter can affect the calculation in a negative direction. Here, the second letter of the word is changed from an “x” to an “s”.  Note the decrease in the calculation.

  • The case in the words can matter to a degree as well for this comparison. Simply changing the first letter to uppercase will result in a similar reduction.

  • Using the Jaro Winkler function, which tries to account for data entry errors, the results are slightly better when changing from “x” to “s”.

Let’s now move away from theory.  For a more real-world example, consider two zip code strings: a zip+4 that might be found on the invoice by WFR (for example, 94065-1234) and the plain five-digit value actually recorded in the ERP (94065).

In the edit distance similarity test, the two strings score only 50 out of 100 in resemblance.

However, Jaro-Winkler is a bit more forgiving.  There is a difference, but it scores the two strings as much closer to a match.
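The comparisons above are easy to reproduce against any Oracle database. A minimal sketch using python-oracledb follows; the connection details are placeholders, and the scores printed are simply whatever UTL_MATCH returns for your strings.

```python
# Minimal sketch: comparing strings with UTL_MATCH (connection details are placeholders).
import oracledb

conn = oracledb.connect(user="apps_ro", password="***", dsn="dbhost:1521/ORCLPDB1")

pairs = [
    ("expresso", "expresso"),      # identical strings
    ("expresso", "espresso"),      # one letter changed
    ("94065-1234", "94065"),       # zip+4 on the invoice vs. plain zip in the ERP
]

with conn.cursor() as cur:
    for s1, s2 in pairs:
        cur.execute(
            """select utl_match.edit_distance_similarity(:s1, :s2),
                      utl_match.jaro_winkler_similarity(:s1, :s2)
                 from dual""",
            {"s1": s1, "s2": s2},
        )
        edit_sim, jw_sim = cur.fetchone()
        print(f"{s1!r} vs {s2!r}: edit_distance_similarity={edit_sim}, "
              f"jaro_winkler_similarity={jw_sim}")
```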

The illustrations above are purely representative and do not reflect the exact process WFR uses to assign “confidence”.  They do, however, highlight visually how much data accuracy matters.

The takeaway from this ERP data quality discussion should be that small differences between the data appearing on the invoice and the data found in the ERP matter.  This data cleanup is “free” in the sense that the customer can (and should) undertake it without using consulting dollars.

Both the Inspyrus and Oracle Accelerator implementations of the WFR project leverage a custom vendor view in the ERP.

  • Making sure this view returns all of the valid vendors is critical for correct identification of the vendor. A vendor that is not found in this view cannot be found by WFR, plain and simple, since the WFR process collects and stores the vendor information from this view for processing.
  • Also, be sure in this view to filter out intercompany and employee vendor records. These vendor types are typically handled differently, and the addresses of these kinds of vendors typically appear as the bill-to address on an invoice.  Your company address appearing multiple times on the invoice can lead to false positives.
  • In EBS, there is a concept of “pay sites”. A “pay site” is where the vendor/vendor site combination is valid for accepting payments and purchases.  Be sure to either configure the vendor/vendor site combination as a pay site, or look to remove the vendor from the vendor view.

PO Number Considerations

On a similar path, take a good look at your purchase order number information.  WFR operates on the concept of looking for string patterns that may/may not be representative of your organization’s PO number structure.  For example, when describing the characteristics of your company’s PO numbers, these are some basic questions you should answer:

  • How long are our PO numbers? 3 characters? 4 characters? 5 or more characters? A mix?  What is that mix?
  • Do our PO numbers contain just digits? Or letters and digits? Other special characters?
  • Do our PO numbers start with a certain sequence? For example, do our PO numbers always start with 2 random letters? Or two fixed letters like “AB”? Or three characters like “X2Z”?

Answering this seemingly basic set of questions allows WFR to be configured to consider only the valid combinations.

  • By discarding the noise candidates, better identification and extraction of PO number data can occur.
  • More accurate PO number extraction can lead to more efficient line data extraction, since the PO data from the ERP can be leveraged and paired, and to better vendor extraction, since the vendor can be set based on the PO number.

Avoid trying to be too general with this exercise.  Casting too wide a net will actually make things worse.  Simply saying “our PO numbers are all numbers 7 to 10 digits long” will result in configurations that pick up zip codes, telephone numbers, and other noise strings. If there are too many variations, concentrate on vendors using the 80/20 rule, where 80% of the invoices come from 20% of the vendor base.
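As a rough illustration of how a precise, per-pattern definition separates real PO numbers from noise (shown in Python regular-expression syntax, which differs from WFR's own mask format, and with made-up PO patterns):

```python
# Hypothetical example: PO numbers for one operating unit are "PO" + 6 digits,
# and for another are two letters, a dash, and 7 digits. Patterns are invented
# for illustration; WFR's mask syntax is not Python regex.
import re

PO_PATTERNS = {
    "US1": re.compile(r"^PO\d{6}$"),
    "EU1": re.compile(r"^[A-Z]{2}-\d{7}$"),
}

# A mix of genuine PO numbers and noise strings of similar length.
candidates = ["PO123456", "30309-1234", "4045551212", "AB-1234567", "123456789"]

for ou, pattern in PO_PATTERNS.items():
    keep = [c for c in candidates if pattern.match(c)]
    print(ou, "keeps", keep)   # US1 keeps ['PO123456'], EU1 keeps ['AB-1234567']
```

A loose pattern such as `\d{7,10}` would instead accept the phone number and the nine-digit string above, which is exactly the problem the text describes.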

General Invoice Quality

Now, one might think “I cannot tell the vendor what kind of invoice to send.”  That’s not an accurate statement at all.  If explained correctly, and provided with a proper incentive, the vendor will typically work to send better invoices.  WFR is very forgiving, but not perfect, and looking at the items in the following list will help.

  • Concentrate initially on the vendors who send in high volumes of invoices.
  • Make sure the invoices are good-quality images with no extra markings covering key data, like PO numbers, invoice numbers, dates, total amounts, etc.
  • Such marks could be handwriting, customs stamps, tax identification stamps, mailroom stamps, or other characters that are not typed or machine-generated. Dirty rollers on scanners can also leave a line across the image.

Hopefully, this article gives an idea of the free things that can be done to increase the efficiency of WFR.

Want to learn more? Contact us today!

How to Use Visual Builder to Create Public-Facing Functionality for Sites

By: Courtney Dooley | Technical Architect

 

Content and Experience Cloud form functionality such as Contact Us, Feedback, and Survey forms is not offered out of the box.  In fact, developing these forms and their functionality can sometimes require purchasing additional services or implementing custom development.  If you have Integration Cloud Service, however, you may not realize that Visual Builder offers a publicly accessible form and process that can be used by sites built within Oracle Content and Experience Cloud.

Visual Builder

Visual Builder is a cloud-based Platform as a Service (PaaS) solution that offers the ability to create Web Applications and Mobile Applications, define Service Connections, and even integrate with Process Cloud.  Although many of these functions require authentication, Visual Builder has the unique option of publicly accessible applications.  In the Feedback use case, we will use Business Objects to define and handle the feedback functionality for public-facing sites.  Although this functionality could be handled using a Web or Mobile Application, Business Objects are quick to set up and configure.

Building Options

The main menu of Visual Builder displays the options below.

  • Mobile Applications
  • Web Applications
  • Service Connections
  • Business Objects
  • Components
  • Processes – Integration with Oracle Integration Cloud Process Applications

For both Mobile and Web Applications, form development and data structures are available for customization and modification to meet the needs of any service.

Additional services can be configured within Service Connections and then called by a form function or workflow.  These services can be selected from a catalog of predefined services, defined by a specification document, or defined by specifying the endpoint of the service.

Components are elements that can be added to a form, such as images, text, buttons, menus, and links.  Field types such as dropdowns, text inputs, and rich text, as well as specific types such as Currency, Email, and Phone, are all available out of the box.

Business Objects

A quick and easy way to create a public service is by creating a Business Object.

  • Overview – besides general properties, relationships can be established to other business objects for other services.
  • Fields – define information to be received and used within the service including audit fields such as creationDate, createdBy etc.
  • Security – set the authentication needed for the service. In the case of a public service, selecting Anonymous User permissions allows for public access.

  • Business Rules – define how to handle the information being provided; the types of handlers that can be defined are listed below.
    • Object Triggers – we will use this one in our Feedback Use Case
    • Field Triggers
    • Object Validators
    • Field Validators
    • Object Functions
  • Endpoints – a base set of API endpoints created automatically when the Business Object is created

 

  • Data – shows all processed data for development, staging, and live processes including the ability to query specific data.

 

Feedback Use Case

For a simple feedback form that can be made public in Content and Experience sites, we create the Business Object as described in the previous section.  We then specify the fields we expect from the feedback form and configure their properties for requirement, uniqueness, and searchability.

Lastly, we add an Object Trigger business rule that executes before a new feedback record is inserted.  This business rule simply sends the feedback data to a specific email inbox.

 

New Actions can be added by clicking on the plus sign within the process flow diagram, then configuring the action to take.

The Email information can be configured by clicking the edit pencil on the Action.  The Email address can be a set value as shown below, or it can be an expression where the value is derived from a service or other data.

Once the business object is configured and saved, the form to present on the site can be created in one of two ways.

  1. Create a Web Application that provides the form and, on submit, inserts the business object, which then processes the notification. This form would be presented to users via an iframe.
  2. Create the form on a Content and Experience Cloud layout or custom component which calls the Visual Builder Cloud Service API for that business object on submit.

The Feedback service will not be available until it has been staged and then deployed, but once deployed, it should be available for use on any public-facing site, as sketched below.
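As a rough sketch of the second option, the snippet below posts a feedback record directly to the business object's REST endpoint. The host, application ID, version, and field names are placeholders, and the endpoint path shown is an assumption; copy the real URL from the business object's Endpoints tab once the application is deployed.

```python
# Minimal sketch: creating a Feedback record through the business object's REST endpoint.
# Host, application id/version, and field names are placeholders; take the real URL
# from the business object's Endpoints tab after the application is staged and deployed.
import requests

ENDPOINT = ("https://your-vb-instance.oraclecloud.com"
            "/ic/builder/rt/FeedbackApp/live/resources/data/Feedback")

payload = {
    "name": "Jane Visitor",
    "email": "jane@example.com",
    "comments": "Great site - the product pages were easy to find.",
}

resp = requests.post(ENDPOINT, json=payload, timeout=30)
resp.raise_for_status()            # anonymous access must be enabled on the business object
print("Created feedback id:", resp.json().get("id"))
```

A custom component on the Content and Experience layout would make the equivalent call from the browser when the visitor submits the form.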

Contact us for more tips and tricks on developing Oracle Visual Builder Cloud Service Applications!

How to Extract Your PO Numbers Consistently in Oracle’s Forms Recognition AP Project

By: William Phelps | Senior Technical Architect

One of the thornier issues when working with Oracle’s Forms Recognition Accounts Payable (“AP”) project is simply and correctly determining and extracting the purchase order number from the invoice image.  This seemingly mundane task is further complicated when the purchase order number is a simple string of digits, much like, and sometimes confused with, telephone numbers, serial numbers, shipment numbers, and similar purely numeric strings found on the invoice.

This is a common problem for many companies using the AP Solution project, and it’s a fair bet that if you are reading this article, your company has the same or similar issue.

Let’s note upfront that there is no single magic bullet that will fix all extraction problems.  This article is intended as a fine-tuning methodology for use once the very basic solutions and ERP data cleanup have been applied.  It’s at that point, when the easy stuff has been done, that any additional techniques should be applied.  (A certified partner can help make these advanced changes with less overall effort and better end results.)

In general terms, the Oracle AP project provides a process called “PO masking” to let the customer tell the software about the general characteristics of its PO number structure.  This approach uses fairly simple regular expressions (or “masks”) to derive potential strings, deemed viable PO number “candidates”, encountered while parsing the invoice text.  This kind of generalized setup almost always produces extraneous candidates.  The process then decides that some candidates from this list are a better match based on where the string is found in the document.  It places a lower ranking, called “weighting”, on candidates embedded within the body of the invoice, as when the PO number is listed within a line description, and instead places a higher “weight” on a wrong value near the page header or top of the invoice.

A somewhat more educated and targeted way to help Forms Recognition get to that right value will involve an additional detailed look at the list of potential candidates.  During this further programmatic inspection, we can try removing or reducing the “weights” of those potential candidates that we think are misses by using true regular expressions in Visual Basic.

For a very simple example, a given operating unit may have only a handful of unique patterns for their PO numbers. Wide, generalized mask definitions intended for multiple operating units will likely result in more misses.

In WFR using the Inspyrus/Solution Accelerator PO header view (“xx_ofr_po_header_v”), the operating unit is available in the view alongside the PO number.  Using this information indirectly, the PO candidate weights can be altered to increase the accuracy of the extraction.

In these cases, the incoming invoice should be coming from a process that is pre-assigning the correct operating unit.  Since we will know the general PO number patterns for each operating unit, the list of extracted potentials can then be whittled down to a very precise list. (The real work is in determining the exact regular expression per operating unit, which is beyond the scope of this post.)
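As a conceptual sketch of that filtering idea only (written in Python rather than the WFR Basic user exit, with hypothetical patterns, candidate structure, and penalty value), the post-evaluate step walks the candidate list and pushes down the weight of anything that does not match the operating unit's pattern:

```python
# Conceptual sketch only: this mirrors the weighting logic of a PO-number
# post-evaluate user exit, not the actual WFR Basic script. Patterns, the
# candidate structure, and the penalty value are all hypothetical.
import re

OU_PO_PATTERNS = {
    "US1": re.compile(r"^PO\d{6}$"),
    "EU1": re.compile(r"^[A-Z]{2}-\d{7}$"),
}

def reweight_candidates(candidates, operating_unit, penalty=0.50):
    """Reduce the weight of candidates that don't match the OU's PO pattern."""
    pattern = OU_PO_PATTERNS.get(operating_unit)
    if pattern is None:
        return candidates                      # unknown OU: leave weights untouched
    for cand in candidates:
        if not pattern.match(cand["text"]):
            cand["weight"] *= penalty          # demote likely noise (phone numbers, zips, ...)
    return sorted(candidates, key=lambda c: c["weight"], reverse=True)

candidates = [
    {"text": "4045551212", "weight": 0.91},    # phone number near the header
    {"text": "PO123456",   "weight": 0.74},    # real PO buried in a line description
]
print(reweight_candidates(candidates, "US1")[0]["text"])   # -> PO123456
```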

For today’s example,

  • Open the AP Solution project in WFR Designer and edit the script for the Invoices class.
  • On the UserExits script page, add the following function at the very bottom of the sheet. (Be sure to only add custom code in designated or legal areas of the script page for supportability.)

Then, in “UserExitPONumberPostEvaluate” on the same script sheet, update the subroutine with the PO filtering code below:

Save the project file and try processing those problem vendors and purchase order numbers again.

Variations of this code have been deployed at several customers, resulting in much-improved PO number extraction rates.  This increased extraction success rate translates into less manual correction and increased invoice processing throughput since PO lines can then also be paired with a greater success rate automatically.

As noted earlier, a certified partner can help make these kinds of advanced changes with less overall effort and better end results.

Contact us if this express lane to regular payments sounds like a great idea!

Integrating with Salesforce using Oracle Integration Cloud

By: Courtney Dooley | Technical Architect

With all of the available integration options, it’s easy to overlook or undervalue tools that are offered to make these integrations easier.  In fact, many of these offerings are not nearly as helpful as they appear to be.  Oracle’s Integration Cloud offers a Salesforce adapter that really minimizes the development required to set up a simple integration with any other system or service.

A Simple Use Case

Salesforce Opportunities often result in a contract for products and/or services.  These contracts are often managed or produced using a contract management tool, which processes approvals and renditions before the final contract is sent for customer signature.  Oracle Integration Cloud includes Process Cloud as its workflow approval engine and interacts tightly with integrations to any number of systems and services.  Although a contract management solution can easily be built within Oracle Integration Cloud Process Applications, for this use case we will use Atlassian’s on-premise Jira service.

Jira offers a built-in REST API that allows for easy integration to create, get, or delete issues.  For this reason, we do not need an Atlassian Jira adapter and can use the out-of-the-box REST adapter instead.
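For reference, the request the REST adapter ultimately issues is a plain Jira REST API call. A minimal sketch follows; the host, credentials, project key, and issue type are placeholders for your own Jira setup.

```python
# Minimal sketch: creating a Jira issue over its REST API (host, credentials,
# project key, and issue type are placeholders for your own Jira configuration).
import requests

JIRA = "https://jira.example.com"

payload = {
    "fields": {
        "project": {"key": "CON"},                     # hypothetical Contracts project
        "summary": "Contract for Opportunity 0061t00000ABC123",
        "description": "Created from Salesforce via Oracle Integration Cloud",
        "issuetype": {"name": "Task"},
    }
}

resp = requests.post(f"{JIRA}/rest/api/2/issue",
                     json=payload,
                     auth=("integration.user", "password"),
                     timeout=30)
resp.raise_for_status()
print("Created issue:", resp.json()["key"])
```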

Salesforce integrations can be triggered either by a workflow action outbound message or by simply calling the integration from a button.  For the integration to be triggered by an outbound message, the outbound message WSDL is required.  The workflow action will send not only the Opportunity ID but also other field data when triggering the integration.

For our use case, we did not have a specific set of field data that would indicate when the integration should be triggered, and although custom links can trigger the outbound message, we went with a button that can be used at any point in the Opportunity life cycle and is easily found alongside the other Opportunity buttons.

When triggered, the integration retrieves the Opportunity details, checks Jira for existing contract issues linked to the Opportunity (this can be tracked within Jira or Salesforce), and, based on the information it has acquired, either makes another REST API call to create a new issue in Jira or returns the existing Jira contract information.  We could also update the existing contract with information from the Opportunity.

 

Need to Know

  1. Connector Requirements

In order to create a connector in Oracle Integration Cloud Service for Salesforce, you will need an integration user to authenticate with that has access to all Opportunities within Salesforce.  You will also need access to the Salesforce environment to generate an Enterprise WSDL identifying the Salesforce service you are integrating with.

Once the generated WSDL is downloaded and the user credentials have been set (including appending the user security token to the end of the user’s password), the connector can be created using the Salesforce Adapter.

 

Oracle Integration Cloud Salesforce Adapter

New Connection Dialog Screen

Salesforce Connector Configuration using the Salesforce Adapter

 

  2. Trigger vs Invoke

Once a connector has been configured and tested, it can be used as a Trigger (which requires the Outbound Message WSDL), an Invoke, or both, depending on how the connector was created.  When the Salesforce connector is used within integrations, the functionality available for use is displayed in the “Action” step of the setup.

 

  3. Salesforce Buttons

Two ways to trigger the integration from a button are to execute JavaScript on click or to execute a URL that calls the integration.  The example below uses the URL option and returns a JSON response including the contract URL or a message about the existing contract.
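For testing outside Salesforce, the same endpoint can be exercised directly. The sketch below assumes a REST-triggered integration following OIC's usual /ic/api/integration/v1/flows/rest/... pattern; the host, integration identifier, version, parameter name, and response fields are placeholders, so copy the real endpoint from the integration's endpoint details page.

```python
# Minimal sketch: invoking the REST-triggered integration the button points at.
# Host, integration code/version, parameter name, and response fields are placeholders;
# copy the real endpoint URL from the integration's endpoint details in OIC.
import requests

URL = ("https://your-oic-instance.integration.ocp.oraclecloud.com"
       "/ic/api/integration/v1/flows/rest/CREATE_CONTRACT/1.0/contracts")

resp = requests.get(URL,
                    params={"opportunityId": "0061t00000ABC123"},
                    auth=("integration.user", "password"),
                    timeout=30)
resp.raise_for_status()
print(resp.json())   # e.g. {"contractUrl": "...", "message": "Existing contract found"}
```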

Other Service Connections

  1. Oracle Integration Cloud out-of-the-box Adapters

  • AutomationAnywhere – Robotic Process Automation (RPA)
  • Concur
  • DB2
  • DocuSign
  • Eloqua
  • Google – Calendars, Emails, and Tasks
  • Microsoft – Calendar, Contacts, and Emails
  • JD Edwards EnterpriseOne
  • LinkedIn
  • MySQL
  • Oracle EBS
  • Oracle Database
  • Oracle DBaaS
  • Siebel
  • SurveyMonkey
  • Twitter
  • Workday
  • FTP
  • REST – for use with any system that has a REST API library
  • SOAP – for use with any system that has a SOAP API library
  • …and many more!

 

So, as you can see, Oracle Integration Cloud offers many ways to integrate Salesforce with almost any system or service quickly and easily.  By developing simple integrations, you can eliminate the rework of entering data into multiple systems, keep data aligned, and keep your business in sync across all resources.

 

Contact us for more tips and tricks on developing Integrations using Oracle Integration Cloud!