Developer-first mindset arrives with Oracle 23c Free

The next Long-Term Support release of Oracle Database, which will be Oracle 23c, is on the horizon and comes with a major change in how Oracle releases its flagship database product: the first version of Oracle 23c is provided as a free edition whose intended audience is the developer – Oracle 23c Free – Developer Release.

This announcement is a blast! At the same time, it is a consistent evolution of the strategy Oracle has been following for the last couple of years, e.g. by providing a comprehensive forever-free account for Oracle Cloud Infrastructure (OCI).

The intention of this release is to provide developers with early access, so that they can use it in new projects, test application compatibility, or just play around with it and give early feedback to Oracle Product Management. The overall goal is to further developer adoption as well as the developer experience. The Developer Release will be upward compatible with Oracle Enterprise Edition and Oracle Cloud databases.

Oracle 23c Free is quite similar to the long-available Oracle XE, which was also a free version with certain limitations (storage limits, a limited number of pluggable databases, etc.). XE was intended to deliver a version with a minimal footprint that provides just the essential features to run basic applications.

With Oracle 23c Free – Developer Release this approach changes, as the Developer Release ships first, before e.g. the Enterprise Edition becomes available. In addition, there are changes concerning permitted features and options, which can be looked up in the documentation. Like Oracle XE, Oracle 23c Free can also be used in production scenarios – of course without any HA options or professional support in case of errors.

Another cool thing about this new free Developer Release is that it can be downloaded directly, without accepting any terms and conditions. Previously, it was always required to accept terms and conditions before downloading software like Oracle XE, which was quite cumbersome for database deployment automation. So, from a developer experience perspective, we have finally arrived in the present!

The new LTS release also contains several new features, the most important innovation being JSON Relational Duality. This feature changes the way modern applications are developed, as developers can now harness the power of both the relational and the JSON paradigm with a single source of truth, experiencing the advantages of both models without the tradeoffs. With a unified data store, they can access, modify, and write data through either approach, ensuring consistency and eliminating the complexities of object-relational mappings. Furthermore, developers can enjoy the benefits of ACID-compliant transactions and concurrency controls, providing a sophisticated solution to tackle data inconsistency issues.
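To make the idea of one source of truth with two access models tangible, here is a toy sketch in plain Java. This is only an analogy – it is not Oracle's duality-view syntax, and all class and method names are made up:

```java
public class DualitySketch {

    // One underlying record – the single source of truth.
    record Order(long id, String customer, double total) {

        // The same data rendered as a JSON document (hand-rolled for brevity).
        String toJson() {
            return "{\"id\":" + id + ",\"customer\":\"" + customer
                    + "\",\"total\":" + total + "}";
        }
    }

    public static void main(String[] args) {
        Order order = new Order(1L, "Alice", 99.5);

        // Relational-style access: typed columns/fields.
        System.out.println(order.customer()); // Alice

        // Document-style access: the same record, viewed as JSON.
        System.out.println(order.toJson());   // {"id":1,"customer":"Alice","total":99.5}
    }
}
```

The point of the real feature is that the database itself keeps both views consistent, including the ACID guarantees – which is exactly what a toy example like this cannot do.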

JSON Relational Duality is not the only new feature, and if you want more details about the others, head over to Gerald Venzl’s technical announcement blog post.

Oracle 23c – Developer Release is available today! There are different options for getting started with the new release, as it is available in the form of:

I was quite surprised by this announcement and have to say that I am delighted to see this move. From my perspective, it is a win-win: Oracle aligns with the needs of developers and gains the opportunity to collect feedback at an early stage, which can be used to improve the product more quickly – and this strategy may also further technology adoption. I am curious to see how the story continues.


Back to in-person – Conference contributions in September

Finally, we’re back to in-person events! September was a very busy month for me, with presentations at various conferences. I want to use this short blog post to collect the talks I delivered and link to the respective slides, in case you missed a presentation but are interested in the materials.

Digital Exchange Bergisches Rheinland 2022 (DIX 2022)

Promoting innovation, establishing networks, shaping transformation – that’s the overall goal of the Digital Exchange, a regional one-day conference in the “Bergisches Rheinland” in the state of North Rhine-Westphalia. With over 700 attendees and 80 presentations, it was a fully packed day with lots of opportunities to learn new and interesting things, exchange ideas, or simply meet people.

At DIX 2022, I delivered two sessions:

DOAG 2022

DOAG is the main conference of the German Oracle User Group. With about 1000 participants, the largest Oracle user conference in Europe took place again on site, in the city of Nuremberg, after a two-year break. This year, we had one special theme day, shaped by the different communities within the user group, two classic conference days and one training day.

At DOAG 2022, I did the following presentations:

In addition, I was part of a panel discussion about challenges and experiences in Cloud transition projects.

Kong Summit 2022

My highlight so far this year was Kong Summit, which took place in San Francisco. At this two-day conference, with almost 500 attendees and 75 speakers delivering various sessions, I gave a talk about how to implement a consistent observability (o11y) strategy in microservices architectures without changing your service implementations. If you want to learn what happened at Kong Summit, you can read about it in my post on our OPITZ CONSULTING company blog.

For me personally, the event held a special surprise, as I was named Kong Champion of the Year. This was a great honor and shows me that my community activities of the past months are valuable.

Helidon – Java-based Cloud-native application development

In my last post, I wrote about Cloud-native app development in the Oracle Cloud (or more precisely, Oracle Cloud Infrastructure, OCI). From that, we now have a rough idea of Cloud-native development, the principles behind it, and what OCI offers to support Cloud-native applications. In this post – as promised – I’ll give a brief introduction to Helidon, an open-source framework that is mainly driven by Oracle.

Helidon – One framework, two implementation styles

Helidon is basically a set of libraries for implementing Java-based microservices and Cloud-native apps. So far, so good – but what’s the differentiating factor, since there are, as the figure below shows, a lot of these frameworks around these days?

(Figure: overview of Java microservice frameworks)

Among these frameworks, we can distinguish full-stack frameworks, MicroProfile-based frameworks, and microframeworks.

The full-stack frameworks provide everything that is needed to implement a microservice, from accessing persistence services, implementing business logic, and exposing a REST service, up to providing a UI. Spring Boot is the most popular framework in this category, coming with a huge ecosystem for implementing various use cases.

The next category are the MicroProfile frameworks. MicroProfile is a community-driven specification for the development of Java-based microservices. It is hosted by the Eclipse Foundation and comprises a collection of individual specifications, partly borrowed from the classic Java EE / Jakarta EE space. A huge advantage of the MicroProfile spec: it has no reference implementation – it’s just specs and interfaces. This keeps the specification process lightweight and enables short release cycles. Scope and ecosystem are less pronounced in MicroProfile frameworks than in full-stack frameworks, which may impose limitations for certain use cases. On the other hand, it makes those frameworks more intuitive and easier for newcomers to get started with.

Last but not least, there are the so-called microframeworks. They’re often characterized by a reactive, non-blocking architecture and are optimized for fast startup and processing times, so they’re also well suited for Serverless or Function-as-a-Service (FaaS) scenarios. Frameworks of this category (or at least their cores) usually dispense with any form of implicit “framework magic” like dependency injection. However, developers then take responsibility for certain tasks themselves, such as correctly initializing class and object networks or acquiring and releasing resources. Depending on the application scenario, this increases the amount of boilerplate code and the testing effort. On the other hand, these frameworks are extremely lightweight and flexible due to their few external dependencies.
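To illustrate what “no framework magic” means in practice, here is a minimal plain-JDK sketch of manual wiring; all class and method names are hypothetical:

```java
public class ManualWiring {

    // Hypothetical names, plain JDK. In a microframework without DI, the
    // developer wires the object graph explicitly instead of relying on a container.
    static class GreetingRepository {
        String fetchGreeting() { return "Hello"; }
    }

    static class GreetingService {
        private final GreetingRepository repo;

        GreetingService(GreetingRepository repo) {
            this.repo = repo; // explicit constructor injection, no framework magic
        }

        String greet(String name) {
            return repo.fetchGreeting() + ", " + name + "!";
        }
    }

    public static void main(String[] args) {
        // The developer, not a container, initializes the object network.
        GreetingService service = new GreetingService(new GreetingRepository());
        System.out.println(service.greet("Helidon")); // Hello, Helidon!
    }
}
```

The upside of this explicitness: no classpath scanning, no reflection, very fast startup – exactly the trade-off described above.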

With these categories in mind – and as can also be seen in the figure above – Helidon comes in two flavours: a MicroProfile-based one (Helidon MP) and a microframework one (Helidon SE). This also answers the question about the framework’s differentiating factor.

Helidon framework architecture

To support the two implementation flavours, Helidon needs a corresponding framework architecture, which is depicted in the figure below.

(Figure: Helidon framework architecture)

The basis for both framework variants is Netty, an asynchronous, event-driven network application framework.

Helidon SE is the lightweight, reactive variant of the framework and consists of a reactive web server plus features for flexible configuration as well as security. Helidon SE supports a functional programming model. The web server is characterised by a simple, functional routing model and provides support for OpenTracing, metrics, and health checks. So this variant of the framework is a perfect fit for implementing reactive REST-based services.
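To give a feeling for what a simple, functional routing model looks like, here is a tiny plain-Java analogy – note that this is a conceptual sketch with made-up names, not the actual Helidon SE API:

```java
import java.util.Map;
import java.util.function.Function;

public class FunctionalRoutingSketch {

    // A routing table mapping paths to handler functions – the essence of
    // functional routing (illustration only, not Helidon's real API).
    static final Map<String, Function<String, String>> ROUTES = Map.of(
            "/greet", name -> "Hello " + name + "!",
            "/health", ignored -> "UP");

    static String handle(String path, String payload) {
        // Unknown paths fall through to a 404-style handler.
        return ROUTES.getOrDefault(path, p -> "404 Not Found").apply(payload);
    }

    public static void main(String[] args) {
        System.out.println(handle("/greet", "World")); // Hello World!
        System.out.println(handle("/health", ""));     // UP
    }
}
```

In Helidon SE, handlers are registered on a routing builder in a similarly declarative style, and the web server dispatches incoming requests to them.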

Helidon MP is a MicroProfile implementation; it currently supports MicroProfile version 3.2 and is built upon the basic components of Helidon SE. This variant of the framework supports the Java EE specifications for CDI, JAX-RS, JSON-B, and JSON-P. In addition, Helidon MP is extensible via CDI extensions. Currently, such extensions are available for JPA and JTA, along with specific extensions for accessing OCI resources like storage. So this variant of the framework can be used for more advanced use cases.

With respect to flexibility, both framework variants provide REST- as well as gRPC-style service exposure, where gRPC is an experimental feature in the current version of the framework.

From a runtime perspective, Helidon runs on top of JDK 8+. In addition, Helidon SE applications support GraalVM and can be compiled to GraalVM native images. The latter is very interesting, especially for lowering startup times and decreasing the resulting Docker image’s size.


Helidon is flying! The current version is 1.4.4, a minor bugfix release arrives nearly every month, and in parallel the team is working on the next major release, 2.0.0, which will drop JDK 8 support and move completely to JDK 11 APIs. Furthermore, the following features will be added to the framework:

  • Reactive DB Client implementation (Helidon SE)
  • New reactive Web Client
  • Helidon CLI
  • MicroProfile Reactive Streams Operators (Helidon MP)
  • MicroProfile Reactive Messaging (Helidon MP)
  • WebSocket support
  • jlink image support
  • Preliminary GraalVM native-image support for Helidon MP

Some of the aforementioned features are experimental in character and subject to change in future releases. Please check out the release notes for Helidon 2.0.0-M1 and Helidon 2.0.0-M2. Within the release notes you can find useful links to further blog posts explaining some of the features in depth.


Helidon is a very interesting framework when it comes to developing Cloud-native applications on top of the Java ecosystem.

From a developer’s point of view, Helidon’s approach is highly exciting: developers can decide on a variant depending on the use case. The basic framework remains unchanged; only the programming model changes. This makes development more efficient and offers more flexibility in implementing business requirements.

In upcoming blog posts, I’ll go into more detail about developing services using Helidon and also how to run Cloud-native Helidon apps on top of Oracle Cloud infrastructure.

Cloud-native app development with Oracle Cloud

Cloud-native is a very popular keyword nowadays. But is it just another hype topic? My personal opinion is: No, it isn’t. Cloud-native development is essential to build sustainable, future-oriented architectures (in this context we also speak of evolutionary architectures) that can deal with the volatile, rapidly changing requirements with respect to products and business models! But what does Cloud-native mean?

According to the definition of the Cloud Native Computing Foundation (CNCF), Cloud-native apps are loosely coupled, resilient, manageable, and observable. To be able to react quickly to changed business requirements, a robust and consistent automation strategy (CI/CD) is needed as well. Technologically, Cloud-native apps rely massively on containerisation. Conceptually, such apps build on modern concepts like microservices, APIs, DevOps, and the 12-factor app.
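One of these concepts, the 12-factor app, demands among other things that configuration lives in the environment rather than in the build artifact. A minimal plain-Java sketch of that idea (variable and class names are illustrative):

```java
public class AppConfig {

    // 12-factor style configuration: read deployment-specific settings from
    // environment variables, with a fallback for local development.
    static String configured(String name, String fallback) {
        String value = System.getenv(name);
        return (value == null || value.isEmpty()) ? fallback : value;
    }

    public static void main(String[] args) {
        // The same artifact runs unchanged in every environment; only the
        // environment variables differ per deployment.
        String dbUrl = configured("DATABASE_URL", "jdbc:h2:mem:local");
        System.out.println("Using database: " + dbUrl);
    }
}
```

This is exactly what makes such apps portable between on-premises, vendor A, and vendor B: the code stays the same, the environment changes.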

To make it more concrete: Cloud-native apps are built with a Cloud-first mindset. In addition, those apps should not depend on specific tooling or a specific vendor, so that they can be deployed both in the Cloud and on-premises, as well as in the Cloud of vendor A or vendor B, without changing the implementation. Technologies like Kubernetes (and, of course, other technologies certified by the CNCF) are key to this.

From my perspective, the ideas behind Cloud-native should be the basis for any app that is developed nowadays!

Today, every Cloud vendor provides Cloud-native services; at the very least, all of them provide a managed Kubernetes offering. Looking specifically at the Oracle Cloud, developers are provided with a complete Cloud-native development stack.

(Figure: Oracle Cloud Infrastructure Cloud-native services)

The figure above shows what Oracle Cloud Infrastructure offers in the area of Cloud-native development. In the following, I’ll give a brief introduction to the services, with special focus on the app development & ops services.

As mentioned before, Oracle – like the other vendors – offers a managed Kubernetes service called OKE. Supplementary to this, there is the Oracle Cloud Infrastructure Registry (OCIR), a private Docker registry. Images of apps to be deployed to OKE can be pushed to this registry instead of to public Docker registries like Docker Hub.

With Oracle Functions, there’s also a Serverless/FaaS offering. Functions are built using the Fn Project, which is quite an interesting project in itself. Fn allows developers to develop and test functions completely locally. This is possible due to the flexible architecture of the Fn runtime: function apps are wrapped in Docker images that can be managed in a Docker registry (e.g. OCIR) and executed within the Fn server. This server architecture allows running it on the local development machine, in the local data center, or in the Cloud – a very flexible approach.
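To illustrate how lightweight this model is for developers: in Fn’s Java FDK, a function boils down to a plain Java method that the runtime wraps and invokes – roughly along the lines of the generated quickstart boilerplate (class and method names here are illustrative):

```java
public class HelloFunction {

    // A plain method is the whole function; the Fn runtime packages it into a
    // Docker image and invokes this method per request (no server code needed).
    public String handleRequest(String input) {
        String name = (input == null || input.isEmpty()) ? "world" : input;
        return "Hello, " + name + "!";
    }
}
```

Because the function is just a method, it can be unit-tested like any other Java code before being deployed to the Fn server or to Oracle Functions.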

To be able to access functions provided in the Oracle Cloud, an HTTP endpoint needs to be exposed securely, so that the function can only be called by authorized clients. To ensure this, the OCI API Gateway can be used. The gateway component is completely managed by Oracle; users just have to provision it, define the respective APIs and the corresponding policies, and use it. That’s it!

Provisioning of the gateway, the APIs, and the policies can be fully automated using an Infrastructure-as-Code (IaC) approach with Terraform (which is also the case for most of the Cloud-native services). With respect to IaC, the Resource Manager service is provided, which supports you in provisioning all of the Cloud-native services.

In addition to the aforementioned services, Oracle also provides a Cloud-based development environment, the Developer Cloud Service. This service provides a complete development environment (with the exception of the IDE), containing among other things a Git repo, an artifact repo, Kanban boards, a wiki, and a build server. Setting up a new project can be done within one to two minutes.

With Oracle Helidon (to which I will give a brief introduction in an upcoming post), a microservice development framework is available that can be used in two flavours: a MicroProfile-based approach and a more functional microframework style. So the framework is flexible enough to address different requirements.

From an observability and messaging perspective, the Logging and Streaming services are the most relevant ones in my view. The Logging service (currently in limited availability) provides centralized log management for the services provisioned within a user’s Cloud tenancy. This kind of functionality is very important because of the distributed, loosely coupled way Cloud-native apps are usually built.

The Streaming service is basically a Cloud-native, Kafka-compatible event hub implementation. It is designed for high throughput, with the intention of handling large data streams such as those occurring in IoT scenarios.
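A core property of such Kafka-compatible streams is that messages with the same key are assigned to the same partition, which preserves per-key ordering while still allowing parallel consumption. A simplified plain-Java sketch of that assignment (real Kafka clients use murmur2 hashing, not `hashCode`):

```java
public class PartitionSketch {

    // Simplified key-to-partition assignment in the spirit of Kafka-style
    // streams; floorMod keeps the result non-negative for any hash code.
    static int partitionFor(String key, int partitionCount) {
        return Math.floorMod(key.hashCode(), partitionCount);
    }

    public static void main(String[] args) {
        int partitions = 4;
        // The same key always lands in the same partition, so events from one
        // IoT device, for example, are consumed in order.
        System.out.println(partitionFor("sensor-42", partitions)
                == partitionFor("sensor-42", partitions)); // true
    }
}
```

This is why choosing a good message key (e.g. a device ID) matters: it determines both ordering guarantees and how evenly load spreads across partitions.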

As you can see, Oracle has a solid foundation for building and running Cloud-native applications in the Cloud. More details about the aforementioned services can be found on the official landing page.

In upcoming posts, I’ll dig a little deeper into the different services. I’ll also show how the services can be combined together to build the foundation of a Cloud-native runtime and development platform.







Oracle Open World wrap-up: Autonomous Cloud platform to build intelligent Cloud Native apps

Oracle Open World and Code One are just over, so it’s the ideal time to reflect on what happened during the conference days. The big topics of this year’s conference were:

  • New Data centers
  • Autonomous Database enhancements
  • Autonomous Linux
  • Partnerships with Microsoft & VMware
  • Intelligent Apps development (Powered by ML/AI)
  • Cloud Native

In the following sections, let’s take a closer look at each of these topics.

New Data centers

Oracle showed how aggressively they’re building out new Gen-2 data center regions to catch up with their main competitor, AWS.


By the end of this year, 19 data centers are planned all over the world, for both commercial and governmental customers.

Autonomous Database enhancements

The Autonomous Database (ADB) is Oracle’s flagship product for data management in the Cloud. It is nothing new, but a lot of enhancements are currently happening or coming in the near future.

Besides features like automatic indexing, self-scaling, and built-in Machine Learning capabilities, one of the most important messages was that ADB will evolve in the direction of a multi-model data management platform. Multi-model means that besides relational data there will be support for JSON, key-value, graph, spatial, and files. Along these lines, it was announced that a new Autonomous JSON Database will become available, coexisting with Oracle Autonomous Transaction Processing and Autonomous Data Warehouse.

To run production workloads in a secure and isolated way, Autonomous Database Dedicated was announced. This basically means that such database tenants are isolated and run on dedicated Exadata Cloud infrastructure. This model feels like a fully isolated private Cloud within the public Cloud and comes with a guaranteed availability of 99.995%.

Since ADB will be the central data management platform within the Oracle Cloud, security is another hot topic. To further security, Oracle Data Safe was announced: the new, unified database security control center, coming with features like security configuration assessment, user activity auditing, and data masking. This new offering is free for Cloud databases and can also be used for on-prem databases (for those, you’re charged). With that, you get a kind of hybrid, unified, central security control center, which is quite cool from my perspective.

One of the biggest announcements for sure was the free Cloud tier. This includes an always-free Autonomous Database (two micro instances with 20 GB storage and 1 OCPU per instance). In addition, you get the full development experience, because it includes APEX, ORDS, SQL Developer Web, and Machine Learning notebooks. Having said that: APEX has finally arrived on Autonomous Database. APEX is a low-code development platform for rapidly developing database-centric applications.

Autonomous Linux

Taking the autonomous strategy to the next level, Oracle announced Autonomous Linux, along with the new Oracle OS Management Service – according to Oracle, the first and only autonomous operating system offering, eliminating complexity and human error.

You can get more information about these new offerings in the official Press release:

Partnerships with Microsoft and VMware

It is good to see that Oracle is opening up further, embracing open source and collaborating with other companies. Just before Open World, a partnership with Microsoft was announced, and during Open World more details about this partnership were shared, including:

  • Running MSSQL on Oracle Cloud
  • Running Oracle Ecosystem on Azure (Database, Apps, Linux, Java on Azure, WebLogic)
  • Oracle provides license mobility for Oracle Software from On-prem to Azure

Besides that, in some data centers Oracle Gen-2 infrastructure will be co-located with Azure infrastructure, which means lower latency.

Oracle on VMware virtualization was always one of the most annoying topics for Oracle customers. Now Oracle and VMware have announced a partnership that will enable customers to run VMware workloads on Oracle Cloud; furthermore, Oracle will provide technical support for customers running Oracle products on VMware virtualization. Read more about this in the official VMware press release:

Intelligent Apps development

Intelligent apps are supposed to deliver a next-gen user experience by supporting users in an intelligent and convenient way.

In this area, the Oracle Digital Assistant (ODA) platform, which has officially been available since last year’s Open World, is an important building block. ODA allows you to create intelligent, conversational apps. The intelligence is derived from existing data using AI and ML capabilities.

One big announcement for ODA was the upcoming support for voice. On top of that, it was announced that ODA will be able to understand enterprise-specific vocabulary using semantic pattern matching, so that the bot can better understand specific user intents.

ODA is constantly evolving and is also used by Oracle’s SaaS offerings like HCM or ERP (via pre-built skills). To allow ODA to connect and talk to other systems, Oracle Integration Cloud (OIC) can be used, which comes with OOTB business accelerators. Those are basically pre-built integration recipes that can be used as-is or adjusted to a certain use case if needed.


Content management for ODA solutions can be done in Oracle Content and Experience Cloud (OCE), which has evolved into a multi-channel, intelligent content hub.

Besides conversational apps powered by ODA, Progressive Web App development is also a very interesting and relevant topic, especially when it comes to efficient software development. For this, Oracle offers the Visual Builder Cloud Service alongside the Oracle Developer Cloud Service (which is basically free). One announcement here was that these two services will become more integrated (Visual Builder Studio/Platform) to further developer productivity and provide an outstanding developer experience.


Cloud Native

Cloud Native was an omnipresent topic, especially at Oracle Code One. Here we had a lot of presentations on Kubernetes, microservice development, Function as a Service, reactive app development, etc. It was interesting to see that Java is still relevant and, as a programming language, constantly evolving. In the Java keynote, the GA of Java 13 was announced. Furthermore, there are tons of Java-based frameworks for developing microservices (MicroProfile-based ones like Helidon, as well as Quarkus and Micronaut).

From my perspective, new applications today should be developed using Cloud Native technologies, adhering to the respective design principles (12-factor app). So I recommend making yourself familiar with these concepts and technologies. A great source for that is the CNCF website.

Oracle itself has an impressive number of Cloud Native OCI services, like Event Streaming or Functions (based on Project Fn), which are constantly improved and integrated with each other. The idea is to build and run scalable apps in public, private, and hybrid Clouds. The philosophy for those services is to provide a completely managed Cloud Native development stack based on leading open source technologies certified by the CNCF. In this area, new services were announced: the Logging service, for centralized log management, and a native OCI API Gateway. The latter is a fully Oracle-managed API gateway.


By the way – What about On-prem?

Besides all the noise around the Cloud, it is good to see that the on-prem offering is evolving further in parallel. Shortly, we can expect a new patch set for WebLogic Server, which brings in some enhancements and security fixes.

Furthermore, it was announced that WebLogic Server 14.1.1 is currently under development and will be out within this calendar year. This version is expected to fully support Jakarta EE 8 and to run on top of JDK 8 as well as JDK 13. There will also be support for middleware components like SOA Suite or Service Bus with upcoming releases of WebLogic Server (14.1.2).


Puuuuh… A lot is obviously going on. Oracle is moving forward and has – at least from my perspective – a very strong vision of where the Cloud should go. I am glad to see that, especially with respect to the newly announced partnerships, Oracle keeps adopting open source technologies and also gives things back to the open source community (Helidon, Oracle JET, Java EE → Jakarta EE).





Quick steps for setting up a Gateway Node in Oracle API Cloud Service

Oracle API Platform Cloud Service (APIP CS) is an API management platform covering the complete API lifecycle. A general overview of the solution is provided in one of my previous blog posts.

In this blog post, I’ll summarize the steps needed to set up a first API Gateway Node.

Logical Gateway and Gateway Nodes

Before getting started with the Gateway setup, a basic concept needs to be clarified.

Oracle APIP CS supports the concept of a so-called Logical Gateway, which represents a logical configuration and management unit for one or more Gateway Nodes. A Gateway Node is the physical representation of an API Gateway: it is the runtime component where APIs are exposed to the outside world and where the defined API policies are enforced when an API is called by a client.

From a subscription perspective, the number of Logical Gateways is the relevant cost criterion – no matter how many Gateway Nodes are registered to a Logical Gateway.



Before getting started with the installation, a compute instance (OCI, AWS, Azure, on-premises) is needed, on which the Gateway Node will be deployed. In my case, I used an OCI compute instance, which I set up using the OCI console. The general system requirements for the target machine can be found in the documentation.

Create needed users

As mentioned in the documentation, a prerequisite for the API Gateway deployment is the availability of the following two users:

  • Gateway Manager user, who is responsible for managing the Gateway and needs to be assigned to the Gateway Manager role
  • Gateway Runtime user, who is responsible for the interaction between Gateway Node and Management Service and needs to be assigned to the Gateway Runtime role

Those two users need to be created by an Identity Domain administrator using the User section in the Service Dashboard.


After user creation, the respective roles need to be assigned in the user’s details.

Define the Logical Gateway

As a first step, using the Oracle APIP CS Management Portal, I created a new Logical Gateway and named it “Development Gateway”.

In the Logical Gateway Nodes section, the Gateway Node installer can be downloaded.


In addition, the page provides an “Open installation wizard” button, which is useful for creating an initial Gateway installation configuration (gateway-props.json) for the specific Logical Gateway.

In the Grants section of the Logical Gateway, the following grants need to be defined for the two previously created users:

  • Gateway Manager grant to the Gateway Manager user
  • Node Service account grant to the Gateway Runtime user



Install the Gateway

After downloading the Gateway installer, I copied it to my previously configured OCI compute instance, connected to the instance via SSH, and unzipped the installer to /u01/installer.

sudo mkdir -p /u01/apics
sudo mkdir -p /u01/installer

sudo chown -R opc /u01

# <installer-archive>.zip stands for the downloaded Gateway Node installer archive
unzip <installer-archive>.zip -d /u01/installer

After that, I replaced the file /u01/installer/gateway-props.json with the one I created using the installation wizard in the APIP CS Management Portal.

Before the Gateway installation can be started, a valid Oracle JDK needs to be installed and the JAVA_HOME environment variable needs to be set appropriately.

sudo mkdir -p /usr/java

# <jdk-rpm-url> stands for the Oracle JDK 8u131 RPM download URL
sudo curl -j -k -L -H "Cookie: oraclelicense=accept-securebackup-cookie" -o /usr/java/jdk-8u131-linux-x64.rpm <jdk-rpm-url>

sudo rpm -ivh /usr/java/jdk-8u131-linux-x64.rpm

export JAVA_HOME=/usr/java/jdk1.8.0_131

After these preparation steps, the Gateway Node can be installed:

/u01/installer/APIGateway -f /u01/installer/gateway-props.json -a install-configure-start

During installation and configuration, you’re prompted for a WebLogic domain username and password (the WebLogic domain administrator), which will be created during this step. I called the user “weblogic” and chose a respective password.

Join the Gateway Node to the Logical Gateway

After the Gateway Node has been successfully installed and started, it needs to be registered with the previously created Logical Gateway:

/u01/installer/APIGateway -f /u01/installer/gateway-props.json -a join

While executing this step, you’re prompted for the usernames and passwords of the previously created Gateway Manager user and Gateway Runtime user.

In addition to the User credentials, the IDCS Client credentials for APIP CS also need to be passed. Those credentials, namely the Client Id and the Client Secret, can be found in the Platform settings section of the APIP CS Management Portal.


Approving the Gateway Node

After the Gateway has been joined successfully, it needs to be approved by a Gateway Manager, using the Management Portal.


After approving the Gateway and before deploying the first API, the respective load balancer URLs need to be defined for the Logical Gateway instance. Since I have just one Gateway Node, I set them to the hostname of that node.


Testing the API Gateway

For testing purposes, I created a test API against a mock service that replies with the passed status code. The API definition is super simple: it does a pass-through without further policy definitions.


To test the service quickly, I simply did an HTTP call via HTTPie.


This results in the following response:

HTTP/1.1 200 OK
Access-Control-Allow-Credentials: true
Access-Control-Allow-Origin: *
Content-Length: 0
Content-Type: text/html; charset=utf-8
Date: Fri, 10 May 2019 06:38:35 GMT
Referrer-Policy: no-referrer-when-downgrade
Server: nginx
X-Content-Type-Options: nosniff
X-Frame-Options: DENY
X-XSS-Protection: 1; mode=block

With that it’s proven that the Gateway has been deployed successfully and is working correctly.

News and noteworthy about Oracle PaaS – My thoughts on the emerging PaaS Partner Community Forum 2019

5 days packed with lots of information regarding the current product portfolio as well as the respective product strategies – and all that for free? Where can you find an offering like that? Simple answer: at the annual Oracle Emerging PaaS Partner Community Forum!

This year the conference took place in Magaluf (Mallorca, Spain). This traditional event is a must-attend event for partners, since it provides outstanding possibilities
to share and exchange knowledge with other Oracle partners as well as the Product Management team. It also provides the chance to give feedback on Oracle’s Cloud Platform portfolio and the products themselves.

The event is also intended to enable partners to implement solutions on the basis of the Oracle platform, by providing an interesting and valuable set of Hands-on Labs (HOL) focusing on the latest product versions and features.

The conference agenda

The first day of the forum belonged solely to the partners, who gave presentations on emerging technologies used in customer projects. Attendees were able to choose between presentations about SaaS and SaaS integration, technical topics like Digital Assistants or GraphQL, and business-related topics, with sessions about Women in IT as well. This day, with its build-your-own-agenda concept, was really cool and full of valuable information.

The second and third day mainly focused on the Oracle Cloud Platform portfolio and the latest developments and innovations within it. The second day gave a high-level and general overview of the platform-level strategy with respect to the following product areas:

  • Content and Experience Management
  • Serverless Functions and APIs
  • Digital Assistants
  • Integration, Process and Machine Learning (“Smart Processes”)
  • Artificial Intelligence

The third day went more into the details of the aforementioned areas. The respective breakout sessions could be chosen individually. The day was structured in four different rounds:

  • Enterprise process & integration track, where aspects like SaaS integration patterns as well as SaaS customization, the general integration portfolio
    and SOA Suite migrations to the Cloud were discussed
  • Development tools & DevOps track, with topics from the area of application modernization with client-side JavaScript, Digital Assistants, progressive app development with VBCS and
    DevOps based on the Oracle Cloud portfolio
  • Application development track, dealing with Microservice development with Helidon, Weblogic on Kubernetes, API Management and last but not least Serverless development with Oracle Functions
  • Innovation track, dealing with innovative solutions in the Content & Experience space (Integration with AI and ML), Enterprise Blockchain, Smart process (Dynamic processes supported by ML and Robotic Process automation)
    as well as Internet of Things

Traditionally, the conference closed with two days of Hands-on Labs (HOL). The offering here was overwhelming: 11(!) different labs were offered, covering everything that was discussed over the days before. An incredible offering and an amazing chance for partners to get their hands dirty on the latest stuff.

Summary and key takeaways

The annual PaaS Partner Community Forum is a first-class conference and, as already mentioned, a must-attend event for Oracle Partners, to get the latest information from Product Management and – even more importantly – to provide feedback on the products, the portfolio and also the current strategy.

My personal key takeaways from this year’s conference are:

  • Oracle continues to embrace and support Open Source technologies by leveraging respective technologies in their products, like Spark for Artificial Intelligence and Machine Learning or Hyperledger Fabric in the Blockchain area; Oracle also publishes frameworks like Oracle JET (JavaScript UIs), Fn Project (Serverless/FaaS) or Helidon (Microservices)
  • Kubernetes is the de-facto next-gen Application delivery platform
  • Don’t underestimate the power, but also the weaknesses, of GraphQL; it’s not REST versus GraphQL, simply use the one that best fits your use case; there are good reasons for a co-existence, e.g. GraphQL for implementing API compositions based on REST APIs
  • Digital Assistants are the latest and fanciest communication channels to engage with customers, partners and also internal colleagues (ChatOps, etc.)
  • Blockchain is becoming more and more relevant for enterprises with respect to transparent and secure End-2-End Business transactions; time to take a deeper look and discuss real-world cases with customers
  • Integration and workflow automation is not new, but it is still a first class citizen and provides the foundation for new concepts, like Digital Assistants; in addition, it all gets more and more integrated with AI and ML, with respect to Smart and predictive processes
  • Helidon is a serious alternative to Spring Boot for implementing Microservices
  • Oracle provides a good and solid Serverless foundation with Fn Project and Oracle Functions (soon to be GA) that should be seriously considered

Last but not least, I want to thank Jürgen Kress for an amazing event, which was perfectly organized and very valuable for all attendees. Thanks for building this amazing community of experts and providing an opportunity to share knowledge in such a great and professional atmosphere!



Autonomous, intelligent and open Cloud – An Oracle Open World and Code One Wrap-up

Oracle Open World 2018 is over, so it’s time to take a step back and recap what happened during some interesting days fully packed with great and useful information.

Oracle Gen2 Cloud Infrastructure – the big thing in IaaS

Oracle Gen2 Cloud Infrastructure (OCI) is intended to deliver better performance (compute, memory, block storage, network) and better pricing to customers than the Gen1 infrastructure.

From an architectural perspective Oracle’s new Cloud infrastructure is more than just a facelift, since it has been re-designed from the ground up.


As the picture above shows, Oracle introduced a completely new tier: the Cloud Control Computers. These specific components, called the impenetrable barrier, run all Cloud control code. Before, the Cloud control code was co-located with all customer code, which was suspected to be less secure and more vulnerable. The Cloud Control Computers surround the Oracle Cloud infrastructure to protect the Cloud as such and additionally surround each customer zone. This leads to enhanced security and more data privacy.

In addition to the impenetrable barrier, Oracle introduced so-called Autonomous Robots that detect and kill potential threats automatically. To be able to identify those threats, the Robots are empowered by Machine Learning algorithms and thus protect the Oracle Gen2 Cloud infrastructure from attacks.

OCI is already available in most regions today and will also be available for Cloud@Customer in Summer 2019.

Oracle Autonomous Database

The Oracle Autonomous Database was already announced during last year’s Open World; now the vision seems to be complete. The Database can be used for implementing transaction-intense applications (OLTP) as well as for building analytics applications (OLAP, Oracle Autonomous Data Warehouse) and leverages the new OCI infrastructure. In the context of the Autonomous Database, autonomous Robots are responsible for:

  • Provisioning
  • Scaling
  • Tuning (tuning is constantly applied)
  • Recovery
  • Patch & update
  • Fault-tolerant failover
  • Backup & Recovery

Doing so, the Database is supposed to be more stable and available (availability: 99.995%) and should allow developers and administrators to focus on more important questions with respect to data organisation and business logic.

From an architectural perspective, Oracle Autonomous Database is designed in a Serverless fashion, which means that customers only need to pay when data is actively processed. When the Servers are idle, nothing needs to be paid – with the exception of storage.
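The consequence of this billing model can be illustrated with a small back-of-the-envelope calculation. The rates below are invented placeholders, not actual Oracle prices; the point is only that compute cost scales with active processing time while storage is billed regardless:

```python
# Back-of-the-envelope sketch of the serverless billing model described above:
# compute is billed only while data is actively processed, storage always is.
# The rates below are invented placeholders, NOT actual Oracle prices.

COMPUTE_RATE_PER_CPU_HOUR = 0.25   # placeholder rate
STORAGE_RATE_PER_TB_MONTH = 25.0   # placeholder rate


def monthly_cost(active_cpu_hours: float, storage_tb: float) -> float:
    """Compute is billed per active hour only; storage for the whole month."""
    return (active_cpu_hours * COMPUTE_RATE_PER_CPU_HOUR
            + storage_tb * STORAGE_RATE_PER_TB_MONTH)


# An idle database still pays for storage, but nothing for compute:
idle = monthly_cost(active_cpu_hours=0, storage_tb=2)    # 50.0
busy = monthly_cost(active_cpu_hours=100, storage_tb=2)  # 75.0
```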

OCI Security announcements

The Security topic was very prominent this year. For the OCI infrastructure the following new announcements in this area were made:

  • Key Management Service – store & manage all encryption keys for all storage layers
  • Cloud Access Security Broker (CASB) – automated, continuous security monitoring and management (e.g. configuration changes done by potential attackers)
  • Web Application Firewall – web application traffic inspection
  • Distributed Denial of Service Protection – automated DDoS attack detection and mitigation of high-volume layer 3 & 4 attacks

With that, the OCI offering becomes more secure and trustworthy, so that customers have less to worry about regarding data security in the Cloud.

News and noteworthy from the SaaS and PaaS space

Oracle SaaS and PaaS solutions benefit from the innovations in the Oracle Gen2 Cloud infrastructure, since the respective solutions run on top of the IaaS components.

Oracle SaaS

In the SaaS space, Oracle claims the market leader position, especially for Cloud ERP. Bringing existing on-premises customers to Oracle Fusion SaaS is something Oracle is working hard on, to make this journey as easy as possible. In addition, it should happen at a very low cost and in the shortest possible period, which depends on the number of customisations built into the existing on-premises solution.

Talking about customisations, Larry Ellison said: “We love extensions, extensions are great! We have these great tools for extensions to our SaaS applications.”, and he further explained that customisations are not welcome. From a long-term maintenance perspective, this is comprehensible.

With great tools, Ellison points, amongst others, to the integration accelerators that can be used to integrate Fusion SaaS apps and the respective data with other applications. Regarding data integration and analytics of the existing Fusion data, Oracle introduced the brand-new Fusion Analytics Data Warehouse, which is built upon the Oracle Autonomous Data Warehouse as well as the Oracle Analytics Cloud Service (PaaS) and is intended to make data analytics very easy and efficient, basically at the push of a button.


Oracle PaaS

Machine Learning (ML) and Artificial Intelligence (AI) seem to be very popular nowadays and were omnipresent in a lot of presentations at Open World this year, as already mentioned in this post when talking about OCI and the concept of the autonomous Robots.

That Oracle takes the topic seriously is also shown by the announced acquisition of the Cloud-based AI data engine company DataFox for undisclosed terms. The acquired tech will enhance Oracle Cloud Applications and the Data-as-a-Service offering.

“Machine learning is a technology as revolutionary as the internet” (Larry Ellison, CTO Oracle)

ML and AI technologies (and therefore the Autonomous Data Warehouse, which provides the data basis) are also the basis for the newly announced Oracle Digital Assistant, which is the next evolution level of Chatbots and Intelligent Bots.

Unlike the Chatbot or Intelligent Bot offerings before, the Oracle Digital Assistant is a new standalone service offering that combines diverse so-called skills for different business contexts under a common interface. This makes the user experience more consistent, since users have a single entry point to follow up with different user journeys, depending on their current context. Empowered by ML and AI, the Digital Assistant knows, by analysing the information provided by the user, which skill to use to fulfil the current request. From an interface perspective, Oracle provides an app, but also supports integration with existing services like Slack or Facebook Messenger. In addition, completely new support for voice is available, which allows integration with existing voice assistants like Siri or Alexa.

With respect to Oracle Integration Cloud (OIC), we’ll see some new innovations also driven by ML and AI. For example, in the process space there’ll be support for dynamic business rules and next-best-action offerings in the area of dynamic processes, and in the integration space, integrations can be built more efficiently thanks to intelligent recommendations for data mappings.

A new kid on the block in the process and integration space is Robotic Process Automation (RPA), where application integration is done by so-called Robots (not to be confused with the autonomous Robots used by OCI) that basically leverage the existing UI capabilities of an application to realise a certain integration scenario. The RPA technology can be used in cases where no appropriate API is available and integrations need to be established quickly. To implement RPA-based integrations, a developer basically defines a UI Flow, similar to a screencast, which is replayed by the Robot.

For developing and running the Robots, Oracle has established a cooperation with UIPath, a leading company in the RPA space. At Open World, Oracle announced a new OIC RPA Adapter, which can be used to easily integrate with UIPath’s RPA solution and makes the development of those solutions more efficient.

Cloud-native application development

Cloud-native application development denotes a modern approach to building and running applications by exploiting the advantages that the Cloud and emerging technologies deliver. Cloud-native applications embrace the 12-factor principles, integrate concepts like DevOps and Continuous Delivery, and are often built on container technologies.
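As a tiny illustration of one of those principles, consider factor III, “store config in the environment”: the application reads its settings from environment variables instead of baked-in config files, so the same artifact runs unchanged in every stage. The variable names and defaults below are illustrative only, not from the original post:

```python
# Factor III of the 12-factor methodology: configuration lives in the
# environment, so the same build artifact runs unchanged in every stage.
# Names like DATABASE_URL and the defaults are illustrative examples.
import os


def load_config(env=None):
    """Read settings from environment variables, with explicit defaults."""
    if env is None:
        env = os.environ
    return {
        "db_url": env.get("DATABASE_URL", "postgres://localhost:5432/app"),
        "log_level": env.get("LOG_LEVEL", "INFO"),
        "port": int(env.get("PORT", "8080")),
    }


# In production, the container platform would inject these variables:
config = load_config({"PORT": "9090", "LOG_LEVEL": "WARN"})
```

Because configuration is injected from outside, containers built once can be promoted from dev to production without rebuilding, which is exactly what makes this style a good fit for the container technologies mentioned above.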

Oracle also implements some of its Cloud offerings following Cloud-native principles. While doing so, they also share technologies and frameworks with the Open-Source community, like the Oracle JET framework (the standard UI framework used for Cloud UIs). With Fn Project, Oracle last year open-sourced a framework for defining Functions-as-a-Service (FaaS) apps, which are Docker-based and can therefore be executed in a vendor-agnostic way.

At this year’s Open World, Oracle introduced a new framework that was open-sourced just before the conference: Helidon, a framework for implementing Microservices. It comes in two different flavours: MicroFramework, a lightweight and function-based variant, and MicroProfile, which supports MicroProfile version 1.1 and therefore comes with support for JEE features. So Helidon is a valid alternative to Spring Boot when it comes to Microservice implementation on a Java basis.

During Open World 2018, Oracle announced 9 new services to support Cloud-native application development, ranging from managed Kubernetes, Kafka and Serverless to Orchestration, Telemetry, Notifications, Auto Scaling and Cloud Events.


The Orchestration Service, for example, aims at Infrastructure as Code, which is a very important concept for Cloud-native application development, since with it applications become even more independent from the runtime, as the runtime itself becomes part of the software.


From a technology perspective, topics like APIs, Microservice technologies like Service Mesh with Istio or Envoy, and Kubernetes as the next-gen application development platform were prominent citizens, especially at Oracle Code One. In addition, the Kafka platform for real-time data streaming and analytics, Serverless technologies and implementations, as well as Machine Learning based on Open-Source technologies and frameworks were on the agenda.


This year’s Open World was mainly branded by the new Gen2 infrastructure, the enhancements in this area and the autonomy of certain Oracle Cloud components, like the Database or the Data Warehouse. It seems that at least the Oracle IaaS stack is following a consistent vision and is becoming more mature. Also on the PaaS level, the available product palette seems to become more homogeneous and consistent, since everything converges from a higher-level perspective. There are still some teething problems, but maybe that’s just a matter of time.

Code One was a conference with many different facets, amazing presentations and awesome speakers. Here, developers were able to share knowledge and exchange opinions about how application development should be done nowadays. It’s good to see that the trend of embracing Open-Source technologies, which I already noticed last year, has evolved further.

I am curious to see how the observed trends will develop further. At the latest at Oracle Open World and Code One 2019, we’ll see what these trends look like.

DOAG 2017 contributions overview

With the DOAG 2017 conference, my conference year is done. You can find the complete conference program here.


But – as always – after a conference is before the next conference. With this in mind, I just wanted to provide a quick overview of my contributions and where to find the respective materials.

At this year’s conference I gave 3 presentations together with my congenial partner Danilo Schmiedel. We covered different topics all around strategies, guidelines and technologies in the area of digital transformation.

Strategies for efficient Delivery with APIs, Containers, Microservices, DevOps

A central challenge for today’s IT is how to ensure business agility and how to keep the robustness of core business functionalities. A microservice-based architecture in conjunction with APIs for decoupling the building blocks becomes more and more important. By using modern container technologies, companies can leverage all the advantages of modern software development. This session covers the impact of modern software architectures based on a real-life scenario, which we implemented to establish faster delivery cycles and to start innovations. Learn different strategies for how modernized software architectures can be introduced. Slides are available on Slideshare.

Cloud meets On-Premises: Guidelines and Best Practices

Integrating distributed systems or microservices with each other has been a challenging topic over the past years. The architecture blueprint which will be explained in this session considers highlights and lessons learned from our real life experiences with hybrid and multi-cloud environments. As part of that we will demonstrate how to bring back architectural best practices into modern solutions that are suitable for organizations of any size and industry. Slides are available on Slideshare.

Dynamic Processes & DMN Accelerate Digital Transformation

Business Process Management (BPM) as a Service, that is what the Oracle Process Cloud Service (PCS) is about. It provides tools to collaboratively create business processes, forms, rules, documents, services and data in a zero-code web-based environment. Of course BPM is not a new topic and quite a few initiatives failed in the last decade. However, with dynamic processes and decision modeling, Oracle PCS provides two very important enrichments which avoid the main challenges from the past in order to deliver real value to the knowledge workers. But is BPM dead or is it still relevant with respect to digital transformation? This is a question, we’re trying to clarify. Slides are available on Slideshare.

In summary, we had very interesting sessions at the conference, with great feedback and discussions. I am really looking forward to next year. Until then, I’d like to say thank you and see you next year in Nürnberg.

My Oracle Open World 2017 contributions in a nutshell

OOW 17 is behind us, and it was again an amazing and informative experience to attend one of the world’s biggest and most relevant IT conferences.

Time to give you an overview of the contributions and outcomes I was involved in. In total, I delivered 4 sessions:

  • Modernize Your IT Landscape with API-Driven Architectures (delivered 2 times, once at OOW and once at JavaOne Oracle Code track)
  • Strategies for Efficient Delivery with APIs, Containers, Microservices, DevOps
  • Soaring Through the Clouds: Live Demo of 17 Oracle PaaS Services Working Together

In addition, my colleague Danilo Schmiedel and I did a Dev interview with Bob Rhubart about APIs and Microservices and their meaning with respect to modern software architectures.

Concluding the conference, I also wrote a blog post, containing my thoughts and impressions of Oracle Open World 2017, which was published on the OPITZ CONSULTING CattleCrew Blog.

As you can see, a significant number of contributions were delivered and – more important than that – tons of information and impressions were taken home from an impressive conference!