CI/CD best practices: Advice from the 2022 Mobile DevOps Summit speakers

At last year’s Mobile DevOps Summit, we heard about mobile best practices from 75+ mobile leaders at companies like The New York Times, Reddit, Meta, HelloFresh, and more. And this year, we have an even more exciting agenda planned.

Mobile CI/CD helps developers deliver new features quickly through automation, process optimization, and more. But while setting up a mobile CI pipeline is easy, reaching mobile maturity can be challenging.

What do we mean by mobile maturity? Any team can create an app, but it earns the label of a ‘mature mobile app’ when your team releases updates regularly, ships hotfixes quickly when issues arise, and constantly monitors the app’s success and performance. Check out our latest report, which details the key metrics of a top-performing, mature mobile app.

In this blog post, we’ll round up the best mobile CI/CD advice from the Mobile DevOps Summit to help you release better apps, faster.

Our agenda for 2023 is now live, and we have a host of exciting speakers from companies like Runtastic, Skyscanner, and eBay – check it out here!

Day 1 (Oct 4th): A multi-track agenda from some of the biggest names in mobile app development

Day 2 (Oct 5th): Hands-on live workshops delivered by industry experts

Everything is online! Attend from anywhere in the world, making this the most accessible and convenient Mobile DevOps event of the year — and it’s free to attend. Register now.

Automate your builds

In mobile app development, it’s a good idea to automate tedious, manual processes such as creating builds, managing code signing, uploading screenshots, and submitting to app stores. As your user base and code base grow, you’ll need processes in place to scale your app development. So, the earlier you automate your mobile pipeline, the more time you’ll save down the road.

“The first thing you should do is automate some stuff… my advice here is: The earlier you do this, the more it’s going to pay back down the road. Definitely make sure that you give your developers time to spend [setting up automation] because if you make your developers spend their time doing boring, manual tasks, you’re going to have some very unhappy developers” — Neil Kimmett, Director of Engineering at ClassPass in the session — Scaling your mobile app release process — at the Mobile DevOps Summit.
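To make the advice concrete, here is a minimal sketch of giving CI a single entry point by wiring existing Gradle tasks together. It assumes a standard Android Gradle plugin setup; the task names below (testDebugUnitTest, lintDebug, assembleRelease) and the ciBuild task itself are illustrative, not from the talk:

```kotlin
// build.gradle.kts (app module): one task a CI machine can call.
// Task names assume the standard Android Gradle plugin; adjust for your project.
tasks.register("ciBuild") {
    group = "ci"
    description = "Runs unit tests and lint, then assembles a release build."
    dependsOn("testDebugUnitTest", "lintDebug", "assembleRelease")
}
```

On CI, the whole pipeline step then becomes a single `./gradlew ciBuild` invocation, which is easier to keep consistent across machines than a list of ad-hoc commands.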

Shorten your release cadence

The mobile team at eBay needed to ship features faster. With more than three weeks between releases, the team was caught in a vicious cycle of complexities and delays, and received a steady stream of exemption requests to squeeze features in before each release.

So, across three phases, eBay moved to a weekly release cadence: Phase 1 started at a three-week release cadence, and Phase 3 ended at a weekly one. The lifetime of their release branch dropped from roughly 17 days to 5 days.

“Shipping more frequently means we can react faster. Being perfect is the fool’s errand and let’s instead be quick. By increasing our release cadence, we can fix bugs faster. In fact, shipping more frequently means smaller deltas. That means we are more likely to be able to reason about these changes in complexities so that when we find a bug we might know where that is. And in fact, shipping more frequently forces us to change some bad habits in healthier ways”  — Wyatt Webb, the Director of Native Platform Engineering at eBay, in the session — The Journey to Weekly Releases at eBay — at the Mobile DevOps Summit.

Shift left testing: Test early and often

When the UXMA mobile team first started out, they were running end-to-end UI tests to test the app from the user perspective, while also testing the business logic and communication. With a small app, they were able to run end-to-end UI tests without long testing times. However, as they grew, that became unmanageable.

“Tests need to be part of the CI/CD process… test results should be stable and repeatable. A change in our test strategy was necessary, so we stopped our UI tests — descoped them completely — and focused on unit integration tests… they run much faster and are more stable. Most importantly what they did was it brought back our confidence in our test results.” — Apostolos Giokas, Lead Software Engineer at UXMA in the session — Designing automated tests to scale — at the Mobile DevOps Summit.

As the team grew, they could build more features; as a result, the code base grew and their test confidence decreased. So, they pivoted and improved their mobile testing with two main actions:

First, “we used clean architecture in order to have better dependency management… clean architecture helped us to keep our units small and easily writable. For our developers, it became much easier to write stable unit tests. Also, the code base became cleaner and [much easier to maintain],” said Apostolos.
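Here is a minimal sketch of what “small, easily testable units” can look like in Kotlin: the use case depends only on an interface, so a plain JUnit test can supply a fake in-memory implementation. The class and test names are illustrative, not from UXMA’s code base:

```kotlin
// A use case that depends on an abstraction, not a concrete data source,
// so it can be unit tested without any framework or network setup.
import org.junit.Assert.assertEquals
import org.junit.Test

interface OrderRepository {
    fun pendingOrders(userId: String): List<String>
}

class GetPendingOrderCount(private val repository: OrderRepository) {
    operator fun invoke(userId: String): Int = repository.pendingOrders(userId).size
}

class GetPendingOrderCountTest {
    // A fake repository keeps the test fast, stable, and repeatable.
    private val fakeRepository = object : OrderRepository {
        override fun pendingOrders(userId: String) = listOf("order-1", "order-2")
    }

    @Test
    fun `returns the number of pending orders`() {
        val getPendingOrderCount = GetPendingOrderCount(fakeRepository)
        assertEquals(2, getPendingOrderCount("user-42"))
    }
}
```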

Next, “we decided to use feature-based modularization… features are separated into different modules… feature modules — which contained separate features, core modules — which contained business logic, and shared modules for our utilities… The scope of each module was limited and therefore it was very easy to test. Because of the size of the modules, build and test time were also reduced. We were able to build and test multiple modules in parallel, and our overall test time dropped from 45 minutes to under 25. Also, flaky tests were drastically reduced,” said Apostolos.
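As a rough illustration of that feature/core/shared split, a Gradle settings file for a modularized Android project could look like the sketch below. The module names are hypothetical, not UXMA’s actual project structure:

```kotlin
// settings.gradle.kts: hypothetical module layout for a feature-modularized app.
rootProject.name = "sample-app"

include(
    ":app",                 // thin shell that wires feature modules together
    ":core:domain",         // business logic
    ":core:network",
    ":shared:utilities",    // shared helpers
    ":feature:search",      // one module per feature
    ":feature:checkout",
)
```

With `org.gradle.parallel=true` in gradle.properties (or the `--parallel` flag), Gradle can build and test independent modules concurrently, which is one common way to get the parallel builds and tests described above.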

Monitor and analyze your app’s performance

There are two main metrics that GoJek measures to evaluate app performance: app launch time and page performance.

GoJek measures app launch time because a long launch time keeps users waiting, which results in a poor customer experience. They measure page load time using the Time to Initial Load (TTIL) metric, which is “the amount of time taken for a screen to render the content and become interactive to the user. It starts when the user navigates to the screen.” — Alfian Losari, the Lead Mobile Engineer at GoJek in the session — Measuring Mobile App Performance at Scale — at the Mobile DevOps Summit.
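A TTIL-style measurement can be wired up on Android with something as simple as the sketch below, assuming the screen calls the timer when navigation starts and again when its content is rendered and interactive. The class and method names are illustrative, not GoJek’s implementation:

```kotlin
// A minimal time-to-initial-load timer: start it on navigation,
// stop it once the screen has rendered its content and is interactive.
import android.os.SystemClock

class ScreenLoadTimer {
    private var navigationStartMs: Long = 0L

    // Call when the user triggers navigation to the screen.
    fun onNavigationStart() {
        navigationStartMs = SystemClock.elapsedRealtime()
    }

    // Call once the screen is rendered and interactive; returns TTIL in milliseconds.
    fun onContentInteractive(): Long =
        SystemClock.elapsedRealtime() - navigationStartMs
}
```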

GoJek also measures page performance to understand how the app is currently performing. They measure slow frames, frozen frames, and image load time. By measuring and improving these metrics, they optimize app performance.

“Slow frame measures the percentage of users that experience a noticeable amount of slow rendering for a specific screen. Specifically, the metric is the percentage of screen instances during which more than 50% of frames took longer than 16 milliseconds to render,” said Alfian.
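On Android, the per-screen share of frames over the 16 millisecond budget — the building block of the per-screen-instance percentage Alfian describes — can be counted with the platform FrameMetrics API (API 24+). The sketch below is hedged: class names and thresholds are illustrative, and this is not GoJek’s actual implementation:

```kotlin
// Counts frames that exceed the 16 ms budget (60 fps) for an attached Activity.
import android.app.Activity
import android.os.Handler
import android.os.HandlerThread
import android.view.FrameMetrics
import android.view.Window
import java.util.concurrent.TimeUnit

class SlowFrameCounter {
    private val handlerThread = HandlerThread("frame-metrics").apply { start() }
    private var totalFrames = 0L
    private var slowFrames = 0L

    private val listener = Window.OnFrameMetricsAvailableListener { _, frameMetrics, _ ->
        val durationMs = TimeUnit.NANOSECONDS.toMillis(
            frameMetrics.getMetric(FrameMetrics.TOTAL_DURATION)
        )
        totalFrames++
        if (durationMs > 16) slowFrames++ // 16 ms budget for 60 fps rendering
    }

    fun attach(activity: Activity) {
        activity.window.addOnFrameMetricsAvailableListener(listener, Handler(handlerThread.looper))
    }

    fun detach(activity: Activity) {
        activity.window.removeOnFrameMetricsAvailableListener(listener)
    }

    // Share of observed frames on this screen that exceeded the 16 ms budget.
    fun slowFrameRatio(): Double =
        if (totalFrames == 0L) 0.0 else slowFrames.toDouble() / totalFrames
}
```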

For mobile client tracing, GoJek uses the Firebase Performance SDK’s custom tracing for page performance metrics. They also use random sampling to decide whether to collect the trace for a particular session. Additionally, they automatically capture screen traces and network traces.
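A custom screen trace with client-side random sampling could look like the sketch below, using the Firebase Performance Monitoring SDK. The sample rate, trace names, and class name are illustrative assumptions, not GoJek’s actual code:

```kotlin
// Starts a Firebase Performance custom trace only for a sampled subset of sessions.
import com.google.firebase.perf.FirebasePerformance
import com.google.firebase.perf.metrics.Trace
import kotlin.random.Random

class SampledScreenTracer(private val sampleRate: Double = 0.1) {

    // Start a trace for this screen only if the session falls inside the sample.
    fun startScreenTrace(screenName: String): Trace? {
        if (Random.nextDouble() >= sampleRate) return null
        return FirebasePerformance.getInstance()
            .newTrace("screen_load_$screenName")
            .also { it.start() }
    }

    // Stop the trace (if one was started) once the screen is interactive.
    fun stopScreenTrace(trace: Trace?) {
        trace?.stop()
    }
}
```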

Being able to identify negative trends in performance is key to optimizing overall app performance. That’s why it’s also worth monitoring build and test times, since long build and test times can slow down releases and take up valuable developer time. To monitor build and test times, try Bitrise Insights.

For more CI/CD advice, join this year’s Mobile DevOps Summit

Mobile apps behave differently from traditional web apps and can require different skill sets, tools, and best practices. That’s why it’s important to be part of a mobile development community that shares advice, best practices, and more. To jump in, start with the upcoming Mobile DevOps Summit on October 4 and 5, 2023. Register now.