Why Automation Matters : The cloud may not be right for you, but you still have to automate!

A few days ago I tweeted this link to an article about some workloads being better suited to on-prem infrastructure.

Jared Still sent me this link.

The executive summary in both cases is, if you have defined workloads that don’t require elastic resource allocation, and you are not making use of cloud-only platforms, you might find it significantly cheaper to run your systems on-prem compared to running them in the cloud.

With reference to the first article Freek D’Hooge responded with this.

“I agree that cloud is not always the best or most cost effective choice, but I find the article lacking in what it really takes to run on-prem equipment.”

I responded to Freek D’Hooge with this.

“Yes. On-prem works well if you have Infrastructure as Code and have automated all the crap, making it feel more like self-service.

For many people, that concept of automation only starts after they move to the cloud though, so they never realise how well on-prem can work…”

I’m assuming these folks who are moving back to on-prem are doing the whole high availability (HA) and disaster recovery (DR) thing properly.

There are many counter arguments, and I don’t want to start a religious war about cloud vs on-prem, but there is one aspect of this discussion that doesn’t seem to be covered here, and that is automation.

But you still have to automate!

Deciding not to go to the cloud, or moving back from the cloud to on-prem, is not an excuse to go back to the bad old days. We have to make sure we are using infrastructure as code, and automating the hell out of everything. I’ve mentioned this before.

Of course, racking servers is a physical task, but for most things after that we are probably using virtual machines and/or containers, so once we have the physical kit in place we should be able to automate everything else.

Take a look at your stack and you will probably find there are Terraform providers and Ansible modules that work for your on-prem infrastructure, the same as you would expect for your cloud infrastructure. There is no reason not to use infrastructure as code on-prem.
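To give a flavour of that point, here is a hedged sketch of an on-prem VM defined with the HashiCorp vSphere provider. The provider and resource types are real, but every name, ID and size below is a placeholder, not tested config:

```hcl
# Illustrative only: an on-prem VM defined with Terraform, much like
# you would define a cloud instance. All IDs and names are placeholders.
terraform {
  required_providers {
    vsphere = {
      source = "hashicorp/vsphere"
    }
  }
}

resource "vsphere_virtual_machine" "app_vm" {
  name             = "app-vm-01"                # hypothetical VM name
  resource_pool_id = "resource-pool-id-here"    # placeholder
  datastore_id     = "datastore-id-here"        # placeholder
  num_cpus         = 2
  memory           = 4096
  guest_id         = "oracleLinux8_64Guest"

  network_interface {
    network_id = "network-id-here"              # placeholder
  }

  disk {
    label = "disk0"
    size  = 40                                  # GB
  }
}
```

The point being: `terraform apply` against your own kit feels just like `terraform apply` against a cloud account.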

For many people the “step change” nature of moving to the cloud is the thing that allows them to take a step back and learn automation. That’s a pity because they have never seen how well on-prem can work with automation.

Even as I write this I am still in the same situation. I’m currently building Azure Integration Services (AIS) kit in the cloud using Terraform. I have a landing zone where I, as part of the development team, can just build the stuff we need using infrastructure as code. That’s great, but if I want an on-prem VM, I have to raise a request and wait. I’ve automated many aspects of my DBA job, but basic provisioning of kit on-prem is still part of the old world, with all the associated lost time in hand-offs. For those seeking to remain on-prem, this type of thing can’t be allowed to continue.

In summary

It doesn’t matter if you go to the cloud or not, you have to use infrastructure as code and automate things to make everything feel like self-service. I’m not suggesting you need the perfect private cloud solution, but you need to provide developers with self-service solutions and let them get on with doing their job, rather than waiting for you.

Check out the rest of the series here.

Cheers

Tim…

Are you running production databases on the cloud? Poll results discussed.

It can be quite difficult to know if your impression of technology usage is skewed. Your opinion is probably going to depend on a number of factors including what you read, who you follow, and the type of company you work for. For this reason I asked some questions on Twitter the other day, just to gauge the response.

Let me start by saying, this is a small sample size, and most of my followers come from the Oracle community, including a number of Oracle staff. This may skew the results compared to other database engines, and technology stacks. I’m commenting on the results as if this were a representative sample, but you can decide if you think it is…

So this was the first question I asked.

Is your company running production relational databases in the cloud?

We can see there was a fairly even spread of answers.

  • All prod DBs in cloud: Nearly 19% of responses picking this option kind of surprised me. I speak to a lot of people, and there always seems to be something they’ve got that doesn’t fit well in the cloud for them. Having this many people saying they’ve managed to make everything fit is interesting.
  • Some prod DBs in cloud: I expected this response to be high and with over 27% it was. When we add this to the previous category, we can see that over 46% of companies have got some or all of their production relational databases in the cloud. That’s a lot.
  • Not yet, but planned: At over 24%, when added to the previous categories, it would seem that over 70% of companies see some value in running their databases in the cloud. Making that initial step can be difficult. I would suggest people try with a greenfield project, so they can test the water.
  • Over my dead body: At 29%, this is a lot of people that have no intention of moving their databases to the cloud at this moment in time. We might get some answers about why from the next question.

This was my second question.

What’s stopping you from moving your databases to the cloud?

Once again, we get a fairly even spread of responses.

  • Legal/Compliance: Over 17% of respondents have hit this brick wall. Depending on your industry and your country, cloud may not be an option for you yet. Cloud providers are constantly opening up data centres around the world, but there are still countries and regions that are not well represented. Added to that, some organisations can’t use public cloud. Most cloud providers have special regions for government or defence systems, but they tend to be focused in certain geographical regions. This is a show stopper, until the appropriate services become available, or some hybrid solution becomes acceptable.
  • Company Culture: At over 30%, this is a road block to lots of things. Any sort of technology disruption involves a change in company culture, and that’s one of the hardest things to achieve. It’s very hard to push this message from the bottom up. Ultimately it needs senior management who understand the need for change and *really* want to make that change. I say *really* because I get the feeling most management like to talk the talk, but very few can walk the walk.
  • Cloud is Expensive: At nearly 29%, this is an interesting one. The answer to the question, “is cloud more expensive?”, is yes and no. 🙂 If you are only looking at headline figures for services, then it can seem quite expensive, but the cloud gives us a number of ways to reduce costs. Reserved instances reduce the cost of compute power. Selecting the correct shape and tier of the service can change costs a lot. Spinning down non-production services when they are not used, and down-scaling production services during off-peak hours can save a lot of money, and these are not things that necessarily result in a saving on-prem. I also get the impression many companies don’t work out their total cost of ownership (TCO) properly. They forget that their on-prem kit requires space, power, lighting, cooling, networking, staffing etc. When they check the price of a service on the cloud, it includes all that, but if you don’t take that into consideration, you are not making a fair comparison. Some things will definitely be cheaper on the cloud. Some things, not so much. 🙂
  • Cloud Sucks: At nearly 23%, this is a big chunk of people. It’s hard to know if they have valid reasons for this sentiment or not. Let’s take it on face value and assume they do. If this were a reflection of the whole industry, it’s going to be interesting to see how these people will be won over by the cloud providers.
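The off-peak shutdown arithmetic from the “Cloud is Expensive” point above is easy to sketch. The rate and hours here are made-up numbers, purely to show the shape of the saving:

```python
# Illustrative arithmetic only: the saving from shutting a non-production
# service down outside office hours. The rate is a made-up figure.
HOURS_PER_MONTH = 730  # approximate hours in a month


def monthly_cost(rate_per_hour, hours_running=HOURS_PER_MONTH):
    """Simple linear cost model: hourly rate multiplied by hours running."""
    return rate_per_hour * hours_running


# Hypothetical non-prod instance at $0.50/hour.
always_on = monthly_cost(0.50)              # running 24x7
office_hours = monthly_cost(0.50, 12 * 22)  # 12 hours/day, 22 weekdays
saving = always_on - office_hours

print(f"24x7: ${always_on:.2f}, office hours: ${office_hours:.2f}, "
      f"saving: ${saving:.2f}")
```

With those made-up figures the bill drops from $365.00 to $132.00 a month, a saving of roughly 64%, which is why scheduling non-prod shutdowns is usually the first cost optimisation people make.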

The comments resulted in a few interesting things. I’ve responded to some of them here.

  • “Lack of cloud skills.” We all have to start somewhere. I would suggest starting with small proof of concept (POC) projects to test the water.
  • “Unreasonable Oracle licencing restrictions.” In case you don’t know, the core factor doesn’t apply to clouds other than Oracle Cloud, which makes Oracle licensing more expensive on non-Oracle clouds. Of course, everything can be negotiated.
  • “Lack of availability of Cloud experts to assist/advise.” I’m sure there are lots of people that claim they would be able to help, but how many have a proven track record is questionable. 🙂
  • “We have a massive legacy estate to consider.” Certainly, not everything is easy to move to the cloud, and the bigger your estate, the more daunting it is. I’m sure most cloud providers would love to help. 🙂
  • “Latency with fat client applications.” I had this conversation myself when discussing moving some of our SQL Server databases to Azure. It can be a problem!
  • “Seasonal businesses with uncertain money flow may not able to meet the deadlines for subscription payments.” Scaling services correctly could help with this. Scale down services during low periods, and scale up during high periods.
  • “The prime fear is being pulled off from the grid. Undependable internet connections.” Sure. Not every place has dependable networking.
  • “Bandwidth requirements & limited customization possibilities.” Ingress and egress costs vary with cloud providers. It may be that intelligent design of your processes can reduce the amount of data being pushed outside the cloud provider. The cloud is very customisable, so I’m not sure what the issue is here, but I’m sure there are some things that will be problematic for some people.

Overall I think this was an interesting exercise. Even five years ago I would have expected the responses to skew more in favour of on-prem. Barring some huge change in mindset, I would expect the answers to be even more in favour of cloud in another 5 years.

Regardless of your stance, it seems clear that familiarity with cloud services should be on your radar, if it’s not already. Your current company may not be fans of the cloud, but if you change jobs the cloud may be a high priority for your new company.

Cheers

Tim…

PS. I’ve been running my website on AWS since 2016. I started to write about some services on AWS and Azure in 2015. I’ve been playing with Oracle Cloud since its inception in 2016 (I think). Despite all this, I consider myself a dabbler, rather than an expert.

Video : Oracle Data Pump 21c and Cloud Object Stores

In today’s video we’ll use Oracle Data Pump 21c to export to, and import from, a cloud object store.

The video is based on this article.

You might find the following articles useful as part of the setup.

The star of today’s video is Markus Michalewicz, the Vice President of Product Management; Database High Availability, Scalability, MAA & Cloud Migration at Oracle. That’s a bit of a mouthful. 🙂

Cheers

Tim…

Video : DBMS_CLOUD : External Tables

In today’s video we’ll demonstrate the functionality of the DBMS_CLOUD package, with specific reference to external tables based on objects in a cloud object store.

The video is based on a section in this article.

You may find these useful.

The star of today’s video is Alex Zaballa, who has got to be one of the most certified people I’ve ever met. 🙂

Cheers

Tim…

Video : DBMS_CLOUD : Objects and Files

In today’s video we’ll demonstrate the functionality of the DBMS_CLOUD package, with specific reference to objects in a cloud object store and files on the database server file system.

The video is based on part of this article.

You might find these useful also.

The star of today’s video is Roel Hartman, who used to do APEX, but now just runs marathons I think… 🙂

Cheers

Tim…

Data Pump Enhancements in Oracle 21c (and a little support story)

I’ve been having a play with some of the Oracle 21c data pump enhancements. This post links to the resulting articles, and includes a little story about one of the features.

The Articles

Here are the articles I wrote during this investigation.

As an aside, I also knocked up a quick overview of the DBMS_CLOUD package. I’ve used many of the routines in this package in my autonomous database articles over the last few years, but I got a bit sick of jumping around to get the syntax of different operations, so it seemed sensible to have a centralised description of everything, along with working examples.

The Story

The article about using expdp and impdp with a cloud object store (Oracle Cloud Object Storage, AWS S3 or Azure Blob Storage) came with a little bit of drama.

Back in the 18c days it was possible to use the 18c impdp utility to import into an Autonomous Database from a dump file on a cloud object store. I wrote about this at the time (here). At that time export to a cloud object store using the expdp utility was not supported, and the import wasn’t supported with an on-prem database.

Oracle 21c introduced the ability to export from an Autonomous Database to a cloud object store, which worked fine first time. The documentation also listed a new feature called “Oracle Data Pump Supports Export to and Import From Cloud Object Stores”. This sounded very much like it was meant for on-prem databases, and sure enough it was.

When I started trying to use this feature I pretty quickly hit a road block. The expdp utility couldn’t connect to the object store bucket. I raised a call with Oracle Support about it. While I was waiting for a response I figured this functionality may have a dependency on the DBMS_CLOUD package under the hood, so I installed it in my on-prem database. The on-prem installation of DBMS_CLOUD was working OK, but the expdp utility was still failing to contact the object store bucket.

Due in part to friends in high places, my SR got picked up and it was confirmed the DBMS_CLOUD installation was an undocumented prerequisite, but it was still not working for me. The support engineer confirmed they could replicate the issue too. A few interactions between support and development resulted in bug 33323028, which fortunately had a simple workaround. At that point the support engineer was up and running, but I still had a problem. A bit of tracing later and it turned out my remaining issue was PEBCAK (Problem Exists Between Chair And Keyboard)…

When I installed the DBMS_CLOUD package, the instructions said to put a wallet reference in the sqlnet.ora file. I did that and the package seemed to be working OK, so I thought everything was good. Unfortunately I put it under the ORACLE_HOME, and Oracle 21c uses a read-only Oracle home, so that’s the wrong place. It didn’t affect the package, as that picks up the wallet location from a database property, but it affected the expdp and impdp utilities. I keep telling people read-only Oracle homes will trip you up if you are not used to them, and sure enough it tripped me up. Damn you muscle memory! Once the correct sqlnet.ora file was amended, everything was good.
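For reference, this is roughly what the wallet entry looks like, and where it belongs with a read-only Oracle home. The directory path is illustrative, not taken from my actual setup:

```
# With a read-only Oracle home, sqlnet.ora lives under
# $ORACLE_BASE_HOME/network/admin, not $ORACLE_HOME/network/admin.
# The wallet reference itself looks something like this
# (the directory path is a placeholder):
WALLET_LOCATION =
  (SOURCE =
    (METHOD = FILE)
    (METHOD_DATA =
      (DIRECTORY = /u01/app/oracle/admin/cdb1/wallet)
    )
  )
```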

So the journey to get this feature working involved:

  • An undocumented prerequisite, which I guessed.
  • A bug which Oracle Support and the dev folks gave me a workaround to.
  • An idiot (me) trying to learn not to be an idiot.

With a bit of luck the bug fix/workaround will be rolled into a future release update, so you may never see this. The MOS note about the DBMS_CLOUD package installation suggests this might also be part of the database by default in future. That would be great if it happens.

Anyway, after that little drama I was able to export data from my on-prem database to a dump file located on a cloud object store, and import data from a cloud object store into my on-prem database. Happy days!

Thanks to the support and dev folks who helped get me through this! πŸ™‚

By the way, all the other Oracle 21c data pump new features worked without any issues.

So there you have it. Some new articles and a little bit of drama… 🙂

Cheers

Tim…

Oracle Cloud Infrastructure (OCI) and Terraform : First Steps

We’ve got some stuff going on at work using Terraform, or Terrahawks as I like to call it, so I figured it was about time I had a play with it. I probably won’t be doing much of the project work myself, but I like to understand a bit about all the things we do.

The biggest problem with going on one of these “learning missions” is finding something to do that makes it feel real to me. I have some test environments across two Oracle Cloud accounts. One is my free tier account and the other is a trial account I get through the Oracle ACE Program, which has quite a lot of credit. 🙂 I figured I would automate the build of my test environments, so I can trash and rebuild them at will. So with that as my mission, I’ve taken my first steps into Terraform.

I’m not finished yet, and I’m not saying this is production-ready “best practice” stuff. It’s just something I’ve been playing around with, and it works great. Fortunately the Terraform OCI provider and resources do all the heavy lifting, and if you are used to using Oracle Cloud, it’s pretty easy to navigate the documentation, as a lot of it is organised similarly to the menu structure. You can find the top level of the docs here.
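As a taster, a minimal configuration using the Terraform OCI provider looks something like this. The region, CIDR and display name are placeholders, and the variables are assumed to be declared elsewhere (e.g. in a variables.tf):

```hcl
# Minimal sketch: authenticate the OCI provider and create a VCN.
# All OCIDs come from variables; the literal values are placeholders.
terraform {
  required_providers {
    oci = {
      source = "oracle/oci"
    }
  }
}

provider "oci" {
  tenancy_ocid     = var.tenancy_ocid
  user_ocid        = var.user_ocid
  fingerprint      = var.fingerprint
  private_key_path = var.private_key_path
  region           = "uk-london-1"
}

resource "oci_core_vcn" "test_vcn" {
  compartment_id = var.compartment_ocid
  cidr_blocks    = ["10.0.0.0/16"]
  display_name   = "test-vcn"
}
```

From there it’s the usual `terraform init`, `terraform plan` and `terraform apply` cycle, and `terraform destroy` is what makes the trash-and-rebuild workflow so painless.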

As I always say in these situations, it’s early days for me. I’ve got a number of things I want to build, and I’m sure that process will teach me more, and make me look back at these articles and cringe. That’s more rewriting on the way. 🙂

I’m putting this stuff into a GitHub repo, but I’ve not published that yet. I’m still trying to figure out what I should and shouldn’t include. Update: Here’s the GitHub repo.

Cheers

Tim…

PS. If you don’t remember Terrahawks, this might remind you.

Oracle Autonomous Database Cloud 2019 Specialist : My Thoughts

You’ve probably heard that Oracle have made some training and certifications free in recent times (here). We are approaching the end of that period now. Only about 15 more days to go.

Initially I thought I might try and do all the certifications, but other factors got in the way, so I just decided to do one. You can probably guess which one by the title of this post. 🙂

I had seen a few people speaking about their experiences of the training videos, so I thought I would give my opinions. Remember, this is my opinion of the training materials and exam, not my opinion of the cloud services themselves. I am also aware that this was free, so my judgement is going to be different than if I had to pay for it.

The Voices in the Videos

A number of people have been critical about the voices on the training videos. I really didn’t see a problem with them.

When you record videos and do presentations you have to decide who your target audience is. A large number of people that use Oracle have English as a second language. Having spent years presenting around the world I’ve learned you have to slow down a bit, or you lose some of the audience. I do this on my YouTube videos, and it can make them sound a bit monotone at times. When I’ve recorded my videos at my normal talking speed, people have responded to say they were brutally fast. You can’t please everyone. You have to make a choice, and for some professional training materials that probably means speaking slower.

I listened to most of these training videos at 1.5 speed and it was fine. The fact I wanted to listen to it this way is not a criticism of the training. I’ve listened to a number of Pluralsight courses at 1.7 speed, and I tend to listen to non-fiction Audible books at a higher speed. You just have to find what works for you.

It’s just my opinion, but I thought the voice was fine.

Content Inconsistencies

There are inconsistencies between the training materials and the documentation. I originally listed some, but I don’t think it’s really helpful. As with any training material, I think it’s worth going through the training material and documentation at the same time and cross referencing them, as well as trying stuff out if you can. It helps you to learn and it makes sure you really know what you are talking about.

Why are there inconsistencies? I suspect it’s because the cloud services have changed since the training materials were recorded. Remember, there is a quarterly push to the cloud, so every three months things might look or act a little different.

What should you do? I would suggest you learn both the training material, and the reality where the two diverge, but assume the training material is correct for the purpose of the exam, even if you know it to be wrong in reality. This is what I’ve done for all previous certifications, so this is nothing new to me.

How did I prepare?

As mentioned above, I watched the videos at 1.5 speed. For any points that were new to me, or I had suspicions about the accuracy, I checked the docs and my own articles on the subject. I also logged into the ADW and ATP services I’m running on the Free Tier to check some things out.

I did the whole of this preparation on Sunday, but remember I’ve been using ADW and ATP on and off since they were released. If these are new to you, you may want to take a little longer. I attempted to book the exam for Monday morning, but the first date I could get was late Wednesday.

Content

The training content is OK, but it contains things that are not specific to Autonomous Database. Sure, they are features that can be used inside, or alongside ADB, but I would suggest they are not really relevant to this training.

Why? I think it’s padding. Cloud services should be easy to use and intuitive, so in many cases I don’t think they should need training and certification. They should lead you down the right path and warn of impending doom. If the docs are clear and accurate, you can always dig a little deeper there.

This certification is not about being a DBA or developer. It’s about using the ADB services. I don’t think there is that much to know about most cloud services, and what really matters goes far beyond the scope of online training and certifications IMHO. 🙂

Free

The training and certifications are free until the middle of May 2020, which is when the new 2020 syllabus and certifications for some of the content comes out. By passing this free certification you are passing the 2019 certification, and they will stay valid for 18 months, then you will have to re-certify or stop using the title. I guess it’s up to you whether you feel a pressing need to re-certify or not.

Update: Some of the other training and exams are already based on the 2020 syllabus. Thanks to Adrian Png for pointing this out. 🙂

I’m sure this would not be popular at Oracle, but I would suggest they keep the cloud training and certifications free forever. Let’s be honest. Oracle are a bit-player in the cloud market. They need all the help they can get to win hearts and minds. Making the cloud training and certification free forever may help to draw people in. I don’t see this type of material as a revenue stream, but I’m sure some folks at Oracle do.

From what I’ve seen, the training materials are entry level, and something I would encourage people to watch before using the services, so why not make them free? That’s rhetorical. I know the answer. 🙂

Would I pay for it?

No. I watched the material to get a feel for what they included. I’m not saying I already knew everything, because I didn’t, but I knew most of what I wanted to know before using this stuff. Of course, if I had come in clean, this would have been pretty helpful I guess, but I think it would have been just as easy for me to use some of the online docs, blog posts and tutorials to get to grips with things. That’s just my opinion though. Other people may feel differently.

Would I have sat the exam if I had to pay for it? No. I don’t think there is anything here that I wouldn’t expect someone to pick up during their first few hours of working with the service. It’s nice that it’s free, but I’m not sure it makes sense to pay for it.

What about the exam?

The exam just proves you have watched the videos and have paid attention. If someone came into my office and said, “Don’t worry, I’m an Oracle Autonomous Database Cloud 2019 Specialist. Everything is going to be OK!”, I would probably lead them to the door…

I don’t think the exam was so much hard as confusing at times. There were some questions I think need revision, but maybe I’m wrong. 🙂

What about doing the exam online?

This freaked me out a bit. You have to take photos of yourself at your desk, and photos of the room. Somewhere at Pearson Vue they have photos of my washing hanging up. 🙂 You are told not to touch your face, so as soon as I heard that my whole head started to itch. I started to read the first question out loud, and was told I had to sit in silence. I understand all the precautions, and they are fine. It just felt a bit odd. 🙂

So there you have it. Having promised myself I would never certify again, it turns out I’m a liar… 🙂 If you get a chance, give one of the training courses and exams a go. You’ve got nothing to lose. You can read more here.

Cheers

Tim…

Oracle Cloud : Free Tier and Article Updates

Oracle Cloud Free Tier was announced a couple of months ago at Oracle OpenWorld 2019. It was mentioned in one of my posts at the time (here). So what do you get for your zero dollars?

  • 2 Autonomous Databases : Autonomous Transaction Processing (ATP) and/or Autonomous Data Warehouse (ADW). Each has 1 OCPU and 20 GB of user data storage.
  • 2 virtual machines with 1/8 OCPU and 1 GB memory each.
  • Storage : 2 Block Volumes, 100 GB total. 10 GB Object Storage. 10 GB Archive Storage.
  • Load Balancer : 1 instance, 10 Mbps bandwidth.
  • Some other stuff…

I’ve been using Oracle Cloud for a few years now. Looking back at my articles, the first was written over 4 years ago. Since then I’ve written more as new stuff has come out, including the release of OCI, and the Autonomous Database (ADW and ATP) services. As a result of my history, it was a little hard to get excited about the free tier. Don’t get me wrong, I think it’s a great idea. Part of the battle with any service is to get people using it. Once people get used to it, they can start to see opportunities and it sells itself. The issue for me was I already had access to the Oracle Cloud, so the free tier didn’t bring anything new to the table *for me*. Of course, it’s opened the door for a bunch of other people.

More recently I’ve received a few messages from people using the free tier who have followed my articles to set things up, and I’ve found myself cringing somewhat, as aspects of the articles were very out of date. They still gave you the general flow, but the screen shots were old. The interface has come a long way, which is great, but as a content creator it’s annoying that every three months things get tweaked and your posts are out of date. 🙂 I promised myself some time ago I would stop re-capturing the screen shots, and even put a note in most articles saying things might look a little different, but now seemed a good time to do some spring cleaning.

First things first, I signed up to the free tier with a new account. I didn’t need to, but I thought it would make sense to work within the constraints of the free tier service.

With that done I set about revamping some of my old articles. In most cases it was literally just capturing new screen shots, but there were a few little changes. Here are the articles I’ve revamped as part of this process.

There are some other things I’m probably going to revisit and new things I’m going to add, but for now I feel a little happier about this group of posts. They’ve been nagging at the back of my mind for a while now.

If you haven’t already signed up for a Free Tier account, there is literally nothing to lose here. If you get stuck, the chat support has been pretty good in my experience, and please send feedback to Oracle. The only way services get better is if there is constructive feedback.

Cheers

Tim…

Cloud : Who are the gatekeepers now?

There’s something you might consider sinister lurking in the cloud, and it might cause a big disruption in who are considered the gatekeepers of your company’s services. I’ve mentioned governance in passing before, but maybe it’s time for me to do some thinking out loud to get this straight in my own head.

In the on-prem world the IT departments tend to be the gatekeepers, because they are responsible for provisioning, developing and maintaining the systems. If you want some new infrastructure or a new application, you have to go and ask IT, so it’s pretty easy for them to keep a handle on what is going on and stay in control.

Infrastructure as a Service (IaaS)

The initial move to the cloud didn’t really change this. Most people who proudly proclaimed they had moved to the cloud were using Infrastructure as a Service (IaaS), and were really just using the cloud provider as a basic hosting company. I’ve never really considered this cloud. Yes, you get some flexibility in resource allocation, but it’s pretty much what we’ve always done with hosting companies. It’s just “other people’s servers”. As far as IaaS goes, the gatekeepers are still the same, because you need all/most of the same skills to plan, setup and maintain such systems.

Platform as a Service (PaaS)

When we start talking about Platform as a Service (PaaS), things start to get a little bit trickier. The early days of PaaS weren’t a great deal different to IaaS, as some of the PaaS services weren’t what I would call platforms. They were glorified IaaS, with pre-installed software you had to manage yourself. With the emergence of proper platforms, which automate much of the day-to-day drudgery, things started to shift. A developer could request a database without having to speak to the DBAs, sysadmins, virtualisation and network folks. You can of course question the logic of that, but it’s an option and there is the beginning of a power shift.

When we start talking about IoT and Serverless platforms things change big-time. The chances are the gatekeeper will be the budget holder, since you will be charged on a per request basis, and probably have to set a maximum spend per unit time to keep things under control. Depending on how your company manages departmental budgets, the gatekeeper could be whoever has some spare cash this quarter…

Software as a Service (SaaS)

Software as a Service (SaaS) possibly presents the biggest challenge for traditional on-prem IT departments, as the business can literally go out and pick the product they want, without so much of a thought for what IT think about it. Once they’ve spent the money, they will probably come to IT and expect them to magic up all the data integrations to make things work as expected. Also, once that money has been spent, good luck trying to persuade people they backed the wrong horse. SaaS puts the business users directly in the driving seat.

Conclusion

It would be naive to think any movement to the cloud (IaaS, PaaS or SaaS) could be done independently of an existing IT department, but the tide is turning.

The IT world has changed. The traditional power bases are eroding, and you’ve got to adapt to survive. Every time you say “No”, without offering an alternative solution, you’re helping to make yourself redundant. Every time you say, “We will need to investigate it”, as a delaying tactic, you’re helping to make yourself redundant. Every time you ignore new development and delivery pipelines and platforms, you are sending yourself to an early retirement. I’m not saying jump on every bandwagon, but you need to be aware of them, and why they may or may not be useful to you and your company.

Recently I heard someone utter the phrase, “you’re not the only hotel in town”. I love that, and it should be a wake-up call for any traditional IT departments and cloud deniers.

It’s natural selection baby! Adapt or die!

Cheers

Tim…