A layered approach to product design

Some time ago I wrote this post.

The Problem With Oracle : If a developer/user can’t do it, it doesn’t exist.

As a result of that post I was invited to speak to a bunch of product managers at Oracle. In that session I spoke about the contents of that post and also about my thoughts regarding a layered approach to product design. I thought I would briefly outline the latter here.

I used to be an Oracle specialist, back in the days when it was possible to have a normal job as an Oracle specialist. Nowadays I find myself doing a really wide range of things. That inevitably means I am Googling my way to solutions a lot of the time. If I had stayed completely in the Oracle world I think my attitude to technology would be very different. There’s something about going outside your comfort zone that helps you gain perspective. In my previous post I said,

“I’m just a generalist that uses a whole bunch of products from a whole bunch of companies, and most of them are too damn hard to use, even with the assistance of Uncle Google.”

In my head, the solution to this issue is for companies to take a layered approach to every aspect of their products. They should be asking themselves:

  1. Who is the audience for this product today?
  2. Who do we expect the audience for this product to be in the future?
  3. Does the product we’ve created meet the needs of these audiences?

Let’s use the Oracle database as an example of this, though we could equally use many other database and non-database products.

  1. There are aspects of the Oracle database that are focused on the developer audience, but there are also aspects of the database that are focused heavily on the administrator audience. So the audience for this product is split.
  2. Over time I expect the DBA role to completely disappear. We have already seen the beginning of this with cloud-based databases. I expect this trend to continue. As a result, I would suggest the future audience of the Oracle database will be solely developers.
  3. Since the product currently has a split audience, and we expect one of those roles to go, I don’t think the database is targeting the correct audience. Anything that is currently a DBA task needs to be automated out of existence. Anything where the DBA is a gatekeeper needs some redesign.

So how do we achieve this? My suggestion would be to take a layered approach. I’ve discussed a similar idea of a layered approach to documentation.

  • Layer 1: Simple how-to.
  • Layer 2: Give me some details, but don’t make my head hurt.
  • Layer 3: I’ve taken some paracetamol. Do your worst!

This time I’m thinking about how the product itself works.

  • Layer 1: The product works out of the box. A developer with no experience of administering this engine can start doing something useful with it immediately. There are no commonly used features that are out of their realm of control. A company or person may choose to stay within this layer for the lifetime of a project and see no ill effects.
  • Layer 2: If the company have some skills in the product, or a specific aspect of the product, they may choose to dip down into this layer. Here they can perform some actions that are not directly covered in layer 1, but this doesn’t mean they lose all the benefits of layer 1. They are not switching off all automations just because they want to deviate from one bit of default functionality.
  • Layer 3: The company have some people with God-mode skills and want to do almost everything by hand. They want to take full control of the system.

The important point is, people want to work in the layer(s) they are comfortable in, and still do an effective job. This makes the product accessible to everyone, but doesn’t discriminate against those that want to no-life the product, if they can see a benefit in doing so.

I know there will be objections to this approach, and believe me I can make counter-arguments to it myself, but I don’t see a way forward without taking this sort of approach. I’ll go back to a quote by Jacob Duval where he was comparing MySQL and PostgreSQL.

simply put: mysql has a superior developer experience. postgres has a superior DBA experience. DBA is not really a job anymore, so I pick the developer experience every time.

Jacob Duval

The developer experience has to be the #1 focus from now on!

I’m not underestimating the impact of that statement. It is a massive job for a huge and mature product such as the Oracle database, but not doing so is a death sentence IMHO.

I know some people will see this as a cloud sales pitch, but actually it’s not. I think the on-prem products need to live up to these ideals too. Why? Because I see the future as multi-cloud. If Oracle focus entirely on their cloud offerings, people who decide not to pick Oracle Cloud will be left with a sub-par experience when running Oracle products on other clouds. The result of that is they will pick non-Oracle solutions. I don’t think this is a road Oracle should go down.



PS. I’ve kept this post purposely vague, because I think focussing on individual features will make the post huge, and detract from the overall message…

Update. Akiva Lichtner raised an interesting point. I thought I would add it here, in case someone else is using their own interpretation of what I am suggesting.

“You can see from Tesla’s speed of change that vertical integration which is the opposite of a layered approach is a superior approach. Also reminds me of Bryan Cantrill’s old DTrace talk where he says that the layers conspire to make the whole system impossible to troubleshoot”

Akiva Lichtner

My response.

“It depends what your interpretation of the layers are. Sounds like you are interpreting it as something different to me. The Tesla interface has layers. It can self-drive, or you can drive for yourself. Just like what I’m talking about…”


I’m not suggesting we should just keep stacking layer-upon-layer on the existing products. My focus is very much on the potential for a layered audience. As I mentioned above, a Tesla can use autopilot or be driven by a human. Same car. Two modes of operation. We could consider them two audiences.

Vagrant & Docker Builds : APEX 20.2 and other updates

The recent release of APEX 20.2 has triggered a build frenzy.


Vagrant

All my GitHub Vagrant builds that include APEX have been updated to APEX 20.2. The builds themselves are unchanged. This was literally an update to the environment files, so it took longer to test the builds than it did to make the changes.
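To give a feel for how small the change is, each update is essentially a version bump in an environment file, followed by a rebuild. The file and variable names below are hypothetical stand-ins, not the actual contents of my builds:

```shell
# Stand-in for a build's environment file (file and variable names are hypothetical).
cat > software.env <<'EOF'
APEX_SOFTWARE=apex_20.1_en.zip
EOF

# Bump the APEX version the build points at.
sed -i 's/apex_20\.1/apex_20.2/' software.env
cat software.env

# Then rebuild and retest the VM (not run here):
#   vagrant destroy -f && vagrant up
```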

While I was at it, I did a couple of extra updates. I updated Tomcat to version 9.0.39 on all relevant builds, and updated the optional patch script for the single instance database 19c on OL8 build to use the October 2020 bundle patch. The GI bundle isn’t available yet, so I’ve not altered the OL8 19c RAC build. That will happen soon.

Update: I’ve got the GI bundle patch now, and the OL8 19c RAC build has been updated to use it.

There will of course be more updates to the builds once we get the new versions of AdoptOpenJDK, ORDS and SQLcl, which are probably coming soon.


I mentioned in my VirtualBox 6.1.16 post I would be updating the oraclebase/oracle-7 and oraclebase/oracle-8 vagrant boxes to include the VirtualBox 6.1.16 guest additions. Those are done now.


Docker

This is pretty much the same as the Vagrant story.

The relevant GitHub Docker builds for Oracle database and ORDS containers have been updated to include APEX 20.2.

I’ve also added Tomcat 9.0.39 to the ORDS builds, and updated the optional patch script for the database 19c on OL8 build to use the October 2020 bundle patch.

Once again, more changes will appear as the new versions of AdoptOpenJDK, ORDS and SQLcl appear.


Automation is awesome! A few minutes and we are bang up to date!



Video Conference Equipment Breakdown

Buying equipment can be really confusing because the reviews are very inconsistent, with a mix of 5 star and 1 star ratings. A lot also depends on your experience level. If you are a beginner, simplicity is probably your number one priority. As people get more experienced, they often want greater levels of control (not me though). So I thought I would give a breakdown of what I use at the moment, just in case it helps anyone who is a bit confused.

I’ve given links to Amazon where possible.

Microphone Stuff

I tried several USB microphones before I got to this setup. I think this was the cheapest I’ve used and it seems to be the most consistent for my voice. I’m sure I could get better quality with an XLR mic and a mixer, but I really don’t need the extra hassle. I just plug this in and start.

I use a wind screen and a pop filter, yet I still get plosive noises all the time. I used to use a separate adjustable pop filter, but I was constantly readjusting it, which drove me mad, so I changed to one attached to the mic.

Backdrop (Green Screen)

Every frame and backdrop I looked at had extremely varied reviews. Most of the frames were described as flimsy in the reviews, so I spent more and got something that was really solid, which is both height adjustable and can adjust from 4 feet (121 cm) to 10 feet (305 cm) in width. The green screen is thick, and is so wide I have it folded double at the 6 feet width I’m currently using.

  • 3.6Mx2.8M Heavy Duty Backdrop Studio Support System Kit Tripod Adjustable W/Bag : (amazon.co.uk)
  • Neewer 6-Pack Set Heavy Duty Muslin Spring Clamps : (amazon.co.uk)
  • Neewer 9 x 13 feet/2.8 x 4 meters Photography Background Photo Video Studio Fabric Backdrop Background Screen (Green) : (amazon.com) (amazon.co.uk)

It’s early days, but I’m really happy with the result.

I only bought the following brick background because there was a delay with the above green screen, but then the green screen came the next day.

  • Allenjoy White Brick Wall with Gray Wooden Floor Photography Background : (amazon.co.uk)


Lighting

I tried using normal lighting with the green screen and it was OK, but there were green screen artefacts, especially at the creases on the green screen. As a result I decided to get some lights. As soon as these were on and pointing to the green screen, not me, all the artefacts disappeared, even on the heavy creases. These are the lights I got.


Webcam

I checked out the reviews and this webcam seemed to have a good mix of price and features. There are loads to choose from. You could of course use a proper camera if that is your thing, but it’s far too much work for me, so a webcam is fine.

Nothing to complain about here. The quality is fine. It would be even better if I had some reasonable lighting. πŸ™‚


Software

  • Open Broadcaster Software (OBS) Studio : I use this for green screen with my live conference sessions. You can see my basic instructions here. I could use it to put my face into my YouTube videos, but I hate seeing my face in videos.
  • Camtasia : I use this for all my video production for my YouTube channel. I started using it on macOS, but later switched to Windows. It works well on both. It is a lot simpler to use than most of the other video editors I tried.

So that’s it. If you’re involved in the video world you will notice it’s all really basic, and most of all really easy to use. Nothing fancy going on here.



PS. If you only use Zoom or Teams for conferencing, I would suggest using the virtual backgrounds to hide your messy life behind you. You get artefacts on screen, but why would you spend money on all this crap when you can use virtual backgrounds for free? πŸ™‚

Open Broadcaster Software (OBS) Studio : First Time Using Green Screen

I’m not sure if you know this, but a lot of people are using video conferencing, and not all video conference tools allow you to use virtual backgrounds. 😱 That’s a bit of a problem if your washing is permanently on display in the background.

I finally took the plunge and ordered a green screen, but how do you go about using it? Fortunately Samuel Nitsche mentioned Open Broadcaster Software (OBS) Studio, so like the obedient sheep that I am, I gave it a go. This post contains some quick notes to get you up and running if you want to try it.

You can download the software from here.

Add Your Camera

  • Open OBS Studio.
  • Click “+” in the “Sources” panel.
  • Pick “Video Capture Device” from the popup menu.
  • Type in the name you want for your device. Using all the powers of my imagination, I decided on “WebCam”.
  • Click the “OK” button.
  • Your face should be on screen now. If you have multiple cameras connected, select your camera in the “Device” dropdown.
  • Click “OK”, and your camera will be in the “Sources” list.

Enable Green Screen (Chroma Key)

  • Right-click on your Source (WebCam) and select the “Filters” option from the popup menu.
  • Click the “+” under the “Effect Filters” panel.
  • Pick the “Chroma Key” option, and on the subsequent dialog click the “OK” button.
  • Select the background colour under “Key Color Type”. The default is green, but you can also set a specific colour from the background using a colour picker. The default “Green” option worked for me.
  • Adjust the settings until the background is all grey, and you still look clear. The “Similarity” setting seems to make the most difference for me.
  • Once you are happy with the look, click the “Close” button.
  • You should now see yourself on screen with a black background.
  • Position and stretch your image as required. For things like Teams/Zoom meetings, you probably want to make yourself full screen. You can zoom in a little if the edges of your background are showing.

Green Screen Tips:

  • Try to remove big creases from your green screen. Small creases won’t make a big difference.
  • Try to have even lighting on your green screen. Shadows, including any caused by you, will make the green screen less effective. Another tip from Samuel Nitsche.
  • The background images can have a big impact on how well the green screen effect works.
  • Play around until you are happy. It doesn’t have to be perfect!

Add Background Images

  • Click the “+” in the “Sources” panel.
  • Select the “Image” option from the popup menu.
  • Enter a meaningful name for the background image, and click the “OK” button.
  • Click the “Browse” button and pick a background image from your file system.
  • Click the “OK” button.
  • The image will be in the “Sources” list, and it will be higher in the list than your camera.
  • Use the up and down arrows in the source panel to move the image below the camera.
  • You should now be in front of the image of your choice.
  • Add multiple images and use the “eye” icon next to them to switch between them.

Start Using It

  • Click the “Start Virtual Camera” button.
  • In your conference tool (Teams, Zoom etc.) select the “OBS Virtual Camera” device as your video input.
  • Everyone will now see your studio quality green screen. πŸ˜‰

As the title says, this is my first time using this, so my suggestions and advice are probably not worth much. πŸ™‚



PS. I’ve posted a quick breakdown of the equipment I’m currently using here.

Upgrades : You have to do them. When are you going to learn? (TLSv1.2)


  • Do you remember when SSLv3 was a thing?
  • Do you remember when everyone disabled SSLv3 on their websites?
  • Do you remember how loads of people running older Oracle database versions cried because all their database callouts failed?
  • Do you remember how they were all forced to patch or upgrade to get support for TLS?
  • Do you remember thinking, I’ll never let something like that happen again?

I’m so sick of saying this. I know I sound like a broken record, but it’s like I’m living in the movie Groundhog Day.

There is no such thing as standing still in tech. It’s like swimming upstream in a river. It takes work to remain stationary. The minute you stop for a rest you are actually moving backwards. I’m sure your next response is,

“But Tim, if it ain’t broke, don’t fix it!”

The minute you stop patching and upgrading, your application is already broken. Yesterday you had an up-to-date system. Today you don’t. You have stopped, but the world around you continued to move on, and sometimes what they do will have a direct impact on you.

The security folks have been complaining about TLSv1.0 and TLSv1.1 for ages, but we are now in the position where the world and their dog are switching off those protocols, and the “we don’t need no stinking patches or upgrades” brigade are pissing and moaning again.

You knew this was going to happen. You had plenty of warning. It is your fault things are now failing. The bad decisions you made have led you to this point, so stop blaming other people. IT IS YOUR FAULT!

Where do you go from here?

First things first, start planning your patch cycles and upgrade cycles. That isn’t a “one time and done” plan. That is from now until forever. You’ve got to keep your server operating systems and software up to date.

If you can’t cope with that, then move to a cloud service that will patch your shit for you!

I know upgrades aren’t necessarily a quick fix, as they need some planning, so you will need some sticking plasters to get you through the immediate issues. Things to consider are:

  • Your load balancers and/or reverse proxies can hide some of your crap from the outside world. You can support TLSv1.2+ between the client and the reverse proxy, then drop down to a less secure protocol between your reverse proxy and your servers.
  • You can do a similar thing with database callouts to the outside world. Use an internal proxy between you and the external resource. The connection between your proxy and the outside world will speak on TLSv1.2+, but the callout from the database to your proxy will speak using a protocol your database can cope with.
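The reverse proxy idea can be sketched with an Nginx server block that terminates modern TLS for clients, while talking to a legacy backend over whatever it supports. All hostnames, ports and certificate paths here are hypothetical placeholders, not a recommended production config:

```nginx
# Terminate modern TLS at the proxy; the backend never sees the client's TLS version.
server {
    listen 443 ssl;
    server_name app.example.com;                      # hypothetical

    ssl_certificate     /etc/nginx/certs/app.crt;     # hypothetical paths
    ssl_certificate_key /etc/nginx/certs/app.key;
    ssl_protocols       TLSv1.2 TLSv1.3;              # modern protocols only

    location / {
        # Plain HTTP (or an older TLS version) to the legacy app server internally.
        proxy_pass http://legacy-app.internal:8080;   # hypothetical backend
    }
}
```

Remember, this only hides the problem from the outside world. The weak protocol still exists on your internal network.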

These are not “fixes”. They are crappy sticking-plaster solutions to hide your incompetence. You need to fix your weak infrastructure, but these will buy you some time…

I don’t really care if you think you have a compelling counter argument, because I’m still going to scream “WRONG” at you. If you don’t think patching and upgrades are important, please quit your tech job and go be incompetent somewhere else. Have a nice life and don’t let the door hit you on the ass on your way out!



PS. You know this is going to happen again soon, when the world decides that anything less than TLSv1.3 is evil.

Packer by HashiCorp : Second Steps?

In a previous post I mentioned my first steps with Packer by HashiCorp. This is a brief update to that post.

I’ve created a new box called “oracle-7” for Oracle Linux 7 + UEK. This will track the latest OL7 spin. You can find it on Vagrant Cloud here.

I’ve altered all my OL7 Vagrant builds to use this box now.

You will see a new sub-directory called “ol7” under the “packer” directory. This contains the Packer build for this new image.



Do you know where your installation media is?

This was inspired by a Twitter comment and subsequent DMs, but I’m not going to name names. You know who you are… πŸ™‚

Let me ask you some questions.

  • Do you have an archive of all your OS installation media?
  • Do you have an archive of all your software versions?
  • Do you have an archive of the patches you’ve downloaded over the years, along with any supporting tools?

If you answer “No” to any of those questions, you probably need to rethink your approach to managing your software.


You never know when there will be a catastrophic event and you will need to rebuild something. If you don’t have the exact software, you might not be able to get your system up and running again.

Don’t even get me started on build automation and/or documentation…

But I only use the latest software, so I can download it again!

I’m tempted to scream, “Liar!”

Every company I’ve worked for over the last 25+ years has had a mix of products, including some out of support old crap they try not to talk about. If they say they don’t, they are either a new startup, or they are lying.

But I can contact the vendor and get the media!

Can you? Do you know that? In the past I’ve had to open service requests to get old versions of the Oracle database software, and I’ve never been told no yet, but that’s a big risk to take. There is nothing to stop a vendor from hitting the delete key and making it impossible for you to get a copy of that software from them in future.

This is especially important if you are running old versions of products that are out of support.

When should I purge my archive?

I’m tempted to say never, but let’s put a few ground rules in place.

  • A piece of software can only be removed if it is not used in your company anymore.
  • Remember that includes offsite backups. You might need the software to rebuild a system before you can restore/recover a backup. Some places keep old backups for several years, so this could be a long time.
  • For vendors, only when you can 100% guarantee the last of your customers has stopped using that version of the software. 100% guarantee. That probably means never.

Vendors: But we can rebuild that version using our build process!

Shut up. You need to keep all your build artefacts. You can’t guarantee that several years later your build process will be able to build exactly what you need. I know you kid yourself you can, but I think you are probably wrong. Just keep the bloody build artefacts.

What do I do?

I’m sure what I do is not perfect, but it’s pretty good. At work I have all the software we use. For each product version there is a directory containing the base installation media, along with sub-directories for all the patches we’ve downloaded, which includes any supporting tools. In the case of Oracle database software that will include the latest version of OPatch and tools like the PreUpgrade.jar etc.
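The layout I’m describing can be sketched as follows. The product and patch names are made-up examples, not a prescription:

```shell
# Sketch of a per-product-version archive layout (names are hypothetical examples).
mkdir -p software-archive/oracle-db-19c/base
mkdir -p software-archive/oracle-db-19c/patches/2020-10-bundle
mkdir -p software-archive/oracle-db-19c/tools/opatch

# The base installation media lives alongside its patches and supporting tools.
touch software-archive/oracle-db-19c/base/README.txt

# Show the resulting structure.
find software-archive -type d | sort
```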


It is your responsibility to keep hold of all your installation media and patches. If you don’t and a vendor won’t/can’t give you a download, you only have yourself to blame.

I don’t agree with vendors ever deleting old versions of their software, but you have to protect yourself against them potentially doing that.



PS. Don’t ask me to send you old copies of stuff. It is illegal to do that, and I may not have what you are looking for anyway…

PPS. I reserve the right to post a, “I’ve messed up”, post next week when something happens and I don’t have the software. But at least I’ve tried… πŸ™‚

PPPS. Just to prove nothing is ever truly original these days, see Jon Adams’ post on a similar subject here. πŸ™‚

Packer by HashiCorp : First Steps

A few days ago I wrote about some Vagrant Box Drama I was having. Martin Bach replied saying I should build my own Vagrant boxes. I’ve built Vagrant boxes manually before, as shown here.

The manual process is just boring, so I’ve tended to use other people’s Vagrant boxes, like “bento/oracle-8”, but then you are at the mercy of what they decide to include/exclude in their box. Martin replied again saying,

“Actually I thought the same until I finally managed to get around automating the whole lots with Packer and Ansible. Works like a dream now and with minimum effort”

Martin Bach

So that kind-of shamed me into taking a look at Packer. πŸ™‚

I’d seen Packer before, but had not really spent any time playing with it, because I didn’t plan on being in the business of maintaining Vagrant box images. Recent events made me revisit that decision a little.

So over the weekend I spent some time playing with Packer. Packer can build all sorts of images, including Vagrant boxes (VirtualBox, VMware, Hyper-V etc.) and images for Cloud providers such as AWS, Azure and Oracle Cloud. I focused on trying to build a Vagrant box for Oracle Linux 8.2 + UEK, and only for a VirtualBox provider, as that’s what I needed.

The Packer docs are “functional”, but not that useful in my opinion. I got a lot more value from Google and digging around other people’s GitHub builds. As usual, you never find quite what you’re looking for, but there are pieces of interest, and ideas you can play with. I was kind-of hoping I could fork someone else’s repository and go from there, but it didn’t work out that way…

It was surprisingly easy to get something up and running. The biggest issue is time. You are doing a Kickstart installation for each test. Even for minimal installations that takes a while to complete, before you get to the point where you are testing your new “tweak”. If you can muscle your way through the boredom, you quickly get to something kind-of useful.
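For anyone curious what a build like this looks like, here’s a minimal sketch of a Packer template for a VirtualBox Vagrant box. Every value (ISO URL, checksum, Kickstart path, script names) is a hypothetical placeholder, not my actual build, and I’m showing the newer HCL format rather than JSON:

```hcl
# Minimal sketch of a VirtualBox ISO build that outputs a Vagrant box.
source "virtualbox-iso" "ol8" {
  iso_url          = "https://example.com/OracleLinux-R8-U2-x86_64-dvd.iso"  # hypothetical
  iso_checksum     = "sha256:0000000000000000000000000000000000000000000000000000000000000000"
  boot_command     = ["<tab> inst.ks=http://{{ .HTTPIP }}:{{ .HTTPPort }}/ol8.ks<enter>"]
  http_directory   = "http"          # serves the Kickstart file during the install
  ssh_username     = "vagrant"
  ssh_password     = "vagrant"
  shutdown_command = "sudo shutdown -P now"
}

build {
  sources = ["source.virtualbox-iso.ol8"]

  provisioner "shell" {
    scripts = ["scripts/setup.sh"]   # hypothetical guest additions/cleanup script
  }

  post-processor "vagrant" {
    output = "ol8.box"
  }
}
```

The Kickstart file referenced in the boot command is where most of the trial-and-error time goes, as each tweak means sitting through another full OS install.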

Eventually I got to something I was happy with and tested a bunch of my Vagrant builds against it, and it all seemed fine, so I then uploaded it to Vagrant Cloud.

I’ve already made some changes and uploaded a new version. πŸ™‚

You will see a couple of older manually built boxes of mine under oraclebase. I’ll probably end up deleting those as they are possibly confusing, and definitely not maintained.

I’ve also altered all my OL8 Vagrant builds to use this box now.
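Pointing a build at the new box is just a one-line change in each Vagrantfile, something along the lines of this fragment (a sketch, not the exact contents of my builds):

```ruby
# Minimal Vagrantfile fragment using the new box from Vagrant Cloud.
Vagrant.configure("2") do |config|
  config.vm.box = "oraclebase/oracle-8"
end
```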

You will also see a new sub-directory called “packer”. I think you can guess what’s in there. If I start to do more with this I may move it to its own repository, but for now this is fine.

I’m not really sure what else I will do with Packer from here. I will probably do an Oracle Linux 7 build, which will be very similar to what I already have. This first image is pretty large, as I’ve not paid much attention to reducing its size. I’ve looked at what some other builds do, and I’m not sure I agree with some of the stuff they remove. I’m sure I will alter my opinion on this over time.

I’m making no promises about these boxes. In the same way, I make no promises about any of my GitHub stuff. It’s stuff I’m playing around with, and I will mostly try to keep it up to date, but I’m not an expert and it’s not my job to maintain this. It’s just something that is useful for me, and if you like it, great. If not, there are lots of other places to look for inspiration. πŸ™‚



Why I don’t want my presentations recorded!

I was on Twitter a couple of days ago and I mentioned my preference not to be recorded when I’m presenting. That sparked a few questions, so I said I would write a blog post about it. Here it is.

This is a bit of a stream of consciousness, so forgive me if I ramble.

The impact on me!

The primary reason I don’t like being recorded is it has a big impact on me.

I’ve said many times, presenting is not natural for me. I’m very nervous about doing it. I have to do a lot of preparation before an event to try to make it look casual, and almost conversational. It takes a rather large toll on me personally, invading every part of my life for weeks before it happens, and pretty much ruining the day(s) immediately before the event. In my head it’s going to be a complete disaster, and the public humiliation is going to be that much worse because I’m an Oracle ACE and Groundbreaker Ambassador, so I must clearly think I’m the shit, yet I can’t even present and don’t have a clue what I’m talking about. Classic impostor syndrome stuff.

That’s “normal” for me and conferences, which is why I nearly always get a post-conference crash, because of the relief it’s over. But it goes into overdrive if I know the session is going to be recorded, because in my head there will be a permanent record of my screw up.

I have been recorded before, but fortunately not on the sessions where I’ve screwed up… Yet… I don’t think… Recently I’ve decided that I will probably pull out of any event where I’m being recorded, as I can’t keep putting myself through that anymore.

There are other people that will happily fill the conference slot, so me not being there is no big deal.

Editorial control

When I write an article, I constantly go back and revisit things. If my opinion changes, I learn something new, or just don’t like the way I explained something I will rewrite it. I have full control of the content.

When I record a YouTube video I edit it, making sure it contains what I want it to contain. YouTube won’t let you do much in the way of editing a video once it’s posted, but you can make minor changes to the timeline. Even so, if something really annoyed me I could delete it, re-edit it and post it again. Yes I would lose all the views and comments, but ultimately I can do that if I want.

When a user group records a presentation, you no longer have any control of that content. If your opinion changes, or it contains some really dumb stuff, it is there for life. I know nothing is lost on the internet, but at least I should be able to control the “current version” of the content.

I very rarely write for other publications. I like to keep control of my content, so I can decide what to do with it. A lot of this is a throw-back to the previous point about my insecurities, but that’s how I feel about it, and why should I have to compromise about my content?

It’s my content!

Following on from the previous point, it is my content. I wrote it. I rehearsed it. I presented it. And most importantly, I wasn’t being paid to present it! Why should a user group now have control of that content?

Karen LΓ³pez (@datachick) recently posted a really interesting tweet.

“What would you think about an organization who held an event and you spoke at it for free. You signed an agreement to allow distribution to attendees, but they are now selling your content as part of a subscription that you are getting no compensation for?”


I’m not saying this is what user groups are planning, but it’s certainly something some might try, now that times are getting harder than usual.

I’m sorry if this sounds really selfish, but I think I’m doing enough for the community and user groups, without giving them additional product to sell. I know a lot of user groups find finance difficult, but in the current online-model, the financial situation is very different. There aren’t any buildings to hire and people to feed.

The audience matters!

My presentation style varies depending on the audience.

If I present in the UK I tend to speak faster and swear a bit. Similar with Australia. When I present in other countries I tend to tone down my language, as some places are really uptight about expletives.

In some countries where English is a second or third language, I slow down a lot and remove some content from the session, because I know there will be a larger number of people who will struggle to keep up. Maybe I’ll miss out a couple of anecdotes, so I can speak more slowly. If there is live translation I have to go a lot slower.

I remember seeing one recording of me presenting with live translation and I sounded really odd, as I was having to present so slowly for the live translation to work. It was kind-of shocking for me to see it back, and I would prefer people not see that version of the talk, as it doesn’t represent me. It’s “adjusted me” to suit the circumstance.

Other things…

OK. Let’s assume other speakers are not self-obsessed control freaks like me for a second…

It’s possible some people would prefer to be selective about what gets recorded. For example, the first time I do a talk I really don’t know how it will turn out. That’s different to the 10th time I give the same talk. For a new talk I doubt I would feel happy about it being recorded, even if I were generally cool with the concept. I may feel better about recording a talk I have done a few times, having had time to adjust and improve it. I think of this like comedians, who go on tour and constantly change their material based on how it works with the audience. At the end of a tour they record their special, only using the best bits. Then it’s time to start preparing for the next tour. I suspect many comedians would be annoyed at being recorded on the first day of a tour. Same idea…

I think recording sessions could be off-putting for new speakers. When you are new to the game there is enough to worry about, without having to think about this too. Maybe other people aren’t as “sensitive” as me, but maybe they are.

I don’t like to be in pictures and videos. It’s just not my thing. I rarely put myself into my videos on YouTube. I’m sure there would be other speakers who would prefer to be judged by what they say, rather than how they look.

I used to be concerned that if someone recorded my session and put it on YouTube, nobody would come to my future sessions on the same subject. I actually don’t think this is a real problem. It seems the audience for blog posts, videos and conferences is still quite different. Yes, there is some crossover, but there is also a large group of people that gravitate to their preferred medium and stick with it.

But what about…

Look, I really do know what the counter-arguments to this are.

  • Some people can’t get to your session because of an agenda clash, and they would like to watch it later.
  • This gives the user group members a resource they can look back at to remind themselves what you said.
  • This is a resource for existing user group members who couldn’t make it to the event.
  • For paid events, the attendees are paying money, so they have the right to have access to recordings. (but remember, the speakers are not being paid!)

I know all this and more. I am sorry if people don’t like my view on this. I really am, and I’m happy not to be selected to speak at an event. It really doesn’t bother me. Feel free to pick someone else that fits into your business model. That is fine by me. It really is.


Maybe I’m the only person that feels this way. Maybe other people feel the same, but don’t feel they have a loud enough voice to make a big deal out of it.

At the end of the day, it’s my content and I should have the right to decide if I’m happy about it being recorded or not. I believe conferences should make recording optional, and I’ll opt out. If people believe recording should be mandatory, that’s totally fine. It’s just unlikely I will be involved.

I’m sorry if you don’t like my opinion, but that’s how I feel at this point and it’s my choice. My attitude may change in future. It may not. Either way, it’s still my choice!



Update: This is not because of any recent conferences. Just thought I'd better add that in case someone thought it was. I've been asking events not to record me for a while now and there's been no drama. In a recent message for a conference later in the year I was asked to explicitly confirm my acceptance of recording and publishing rights, which is why I mentioned it on Twitter, which then prompted the discussion. Sorry to any recent events if you thought you were the catalyst for this. You weren't. Love you! πŸ™‚

PS. I expected a lot more criticism, and I didn’t expect how many people would respond (through various channels) to say they also don’t like being recorded. It’s nice to know I’m not alone in my paranoia. πŸ™‚

Docker Birmingham March 2020

Last night was Docker Birmingham March 2020. It clashed with the Midlands Microsoft 365 and Azure User Group for the second time, so it was Docker Birmingham’s turn this time. πŸ™‚

These events start with food and I was looking longingly at the pizzas, but I know enough about myself to know it would make me sleepy, so I distanced myself from them until later.

First up was Richard Horridge with "A Brief History of Containers". As the name suggests this was a history lesson, but it started much further back than most do when discussing this subject. With punched cards in fact. Fortunately I never had the "pleasure" of those, but I did find myself thinking, "Oh yeah, I've used that!", about a bunch of the stuff mentioned. That's it. I'm now part of ancient history. I think it's good for some of the younger folks to understand the history of some of this stuff, and the shift from the system administration focus of the past to the application focus of the present.

Next up was Matt Todd with "Say Yes! To K8s and Docker". Let me start by saying I like Swarm. It feels almost like a dirty statement these days, but I do. Matt started in pretty much the same way. He gave a quick pros vs. cons comparison between Swarm and Kubernetes, then launched into the main body of the talk, which was about finding a convenient way to learn Kubernetes on your laptop without needing to install a separate hypervisor. So basically, how to run Kubernetes in Docker. He compared several of the available options for doing this.

He picked K3s as his preferred solution.

Along the way he also mentioned a couple of tools that helped him visualize what's going on inside a Kubernetes cluster while he was learning.

  • Octant. Kind of like Portainer for Kubernetes.
  • K9s. He described it as looking like htop for Kubernetes.

Of course, the obvious question was, "Why not Minikube?", and that came down to his preference for not having to install another hypervisor. It was an interesting take on the subject, and the mention of Octant certainly got my attention.

So once again, I noobed my way through another event. Thanks to the speakers for taking the time to come and educate us, and to the sponsor Black Cat Technology Solutions for the venue, food and drinks. See you all soon!