Enhancing mobile app user experience through efficient caching in Swift
0. TLDR
We introduce a simple solution for instantly loading and refreshing remote API information on your iOS app screens, without interrupting the user experience.
1. Intro
In today’s fast-paced digital world, user experience is more important than ever. Mobile apps, in particular, need to be responsive and provide a smooth and seamless experience to retain users. One way to achieve this is by implementing a cache for app API requests. In this article, we will explore the benefits of caching and learn how to implement it in order to enhance the performance of our mobile app and provide the best possible user experience.
We will begin by identifying the problem, then provide a theoretical solution, and finally demonstrate how to implement our approach in Swift.
2. Use case
As an example, we will use a simple button that, when tapped, retrieves information from a remote API and then updates the app's user interface with the retrieved data.
In our example scenario, the API call typically takes 2 or more seconds to complete. This can result in a poor user experience, as users may perceive the app as unresponsive or slow. To mitigate this, we could implement some kind of loading indicator to inform users that the app is currently retrieving data from the API, but this would increase the time spent developing our app.
Is there a better way?
One way to improve the user experience of our app is by implementing a cache system. By storing the data from previous API requests, we can quickly display this stored data instead of waiting for a new request to complete. This can significantly reduce the time it takes for the app to load data, providing a faster and more responsive experience for the user.
However, it’s important to note that this approach has its limitations. If the data that we are displaying is out of date, it can lead to inaccuracies and confusion for the user.
Is there a better way?
Introducing cache and load. The goal is to provide the user with an immediate display of data while also fetching new data in the background.
By using this approach, the user will experience a faster initial load time, as the cached data can be displayed almost instantly. Additionally, when new data is received, the user interface can be updated again with the most recent information.
In the example below, when the button is tapped, the user interface receives two streams of data. The first event takes only milliseconds to complete, as it displays the cached data, and the second event takes a few seconds as the app retrieves new data from the API. In this way, the user experience is optimized for both speed and accuracy.
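To make this concrete, here is a minimal sketch (assuming Combine; the type names are hypothetical) of how a view model could consume such a stream: it subscribes once and receives two values, the cached one almost immediately and the fresh one when the API call completes.

import Combine
import Foundation

final class EmployeeViewModel: ObservableObject {
    @Published var screenData: String = ""
    private var cancellables = Set<AnyCancellable>()

    // `dataPublisher` is assumed to emit the cached value first (milliseconds)
    // and the fresh API value later (seconds), as described above.
    func onButtonTap(dataPublisher: AnyPublisher<String, Never>) {
        dataPublisher
            .receive(on: DispatchQueue.main)
            .sink { [weak self] value in
                self?.screenData = value   // the UI updates twice: cached data, then fresh data
            }
            .store(in: &cancellables)
    }
}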
This approach (cache and load) will improve the performance of our app. However, implementing this logic for all of our API calls, which can number in the dozens, can be a significant undertaking: it will be time-consuming and probably lead to a lot of repetitive logic and verbose code.
To alleviate this, we will need to find a way to implement caching in a more efficient and streamlined manner that abstracts the caching logic and makes it easy to implement and maintain.
So, how can we do it?
3. Implementing a generic cache handler
This chapter will be divided into 4 sections due to its more in-depth content:
- Part 3.1 — Defining an enum to express our caching intent: CacheStrategy
- Part 3.2 — Defining a data structure to aggregate our cached records: ExpiringCodableObjectWithKey
- Part 3.3 — Defining a manager to retrieve and store our cached records: SimpleCacheManagerForCodable
- Part 3.4 — Defining a manager that receives any kind of API request and a CacheStrategy, and seamlessly handles all the work for us: GenericRequestWithCache
The small "price" for the advantages gained is that, for each service, we need to select a key, the parameters used, and the expected response data type.
3.1: CacheStrategy
The central concept behind this approach is the use of a cache policy/strategy for all the API calls. This policy allows us to choose one of four options, depending on the use case (a minimal sketch of the enum follows the list):
- cacheDontLoad: Use cached data only, without making a new request
- ignoringCache: Skip cached data and force a new request
- cacheElseLoad: Use cached data if available, otherwise make a new request
- cacheAndLoad: My personal favourite, and the one that led to this article. It retrieves cached data for an immediate update of the user interface and, at the same time, retrieves the latest available data.
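A minimal sketch of what such an enum could look like, assuming the case names used by the code in chapter 3.4 (the real project may add associated values or helpers):

/// Expresses how a request should interact with the cache (sketch only).
public enum CacheStrategy {
    case ignoringCache   // skip cached data and force a new request
    case cacheElseLoad   // use cached data if available, otherwise load
    case cacheAndLoad    // emit cached data first, then the fresh API response
    case cacheDontLoad   // use cached data only, never hit the network
}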
With the understanding of the available cache policies, we will now explore how to create a generic data structure (wrapper) for cached records.
3.2: ExpiringCodableObjectWithKey
The first step in implementing our cache is to select an appropriate data structure (wrapper) to store the data, as well as a mechanism for controlling the lifetime of cached records (we need to ensure that our cache has a limited time to live, as we don’t want to use outdated data indefinitely).
One solution is to use a key-value store dictionary or a cache library that supports TTL. This will allow us to store and retrieve data quickly, while also automatically ignoring expired data from the cache to ensure that what we receive is considered fresh.
Our data structure (wrapper) will consist of three fields: key, object, and expireDate.
- The key field will be used to uniquely identify a cached record and will be created by combining the API call name and the parameters used in the call. This way we can differentiate between similar requests like getUserInfo/{userID}, where userID is a variable parameter. For example, if the userID is "007", the key would be something like "getUserInfo_007". This ensures that the cache stores distinct data for each unique API call and its parameters, allowing us to retrieve the correct data quickly and efficiently.
- The object field will store the API response as a cached record, encoded as Data. To achieve this, we can conform our API response to the Codable protocol, which allows us to encode it using try? JSONEncoder().encode(apiResponse). This approach ensures that our cached records are stored in a format that is both efficient and easily retrievable, while also complying with the Codable protocol, so that the data we retrieve is immediately ready to be displayed.
- Finally, the expireDate field will be used to determine whether a cached record is still valid or should be discarded as too old. This field holds the Date after which the record is considered outdated. By default, if no value is passed, the time to live is set to 1440 minutes (24 hours) from the moment the cached record is created. After this date, the record will be considered expired and excluded from usage. This ensures that the cache remains fresh and that outdated data is not displayed to the user.
The extension below is one of many ways to achieve the behaviour we want: get a cached record while ensuring it's not considered outdated.
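As a reference, here is one possible sketch of the wrapper together with such an extension. The property, initializer and method names are assumptions; the real ExpiringCodableObjectWithKey in the project may differ:

import Foundation

/// A sketch of a cached record wrapper: a key, the encoded object, and an expiry date.
public struct ExpiringCodableObjectWithKey: Codable {
    public let key: String
    public let object: Data
    public let expireDate: Date

    /// Encodes a Codable value and computes the expiry date (default: 1440 minutes, i.e. 24 hours).
    public init?<T: Codable>(_ value: T,
                             key: String,
                             params: [String],
                             timeToLiveMinutes: Int = 1440) {
        guard let data = try? JSONEncoder().encode(value) else { return nil }
        self.key = ([key] + params).joined(separator: "_")   // e.g. "getUserInfo_007"
        self.object = data
        self.expireDate = Date().addingTimeInterval(TimeInterval(timeToLiveMinutes * 60))
    }
}

public extension ExpiringCodableObjectWithKey {
    /// Decodes the stored object, but only while the record is still valid.
    func extract<T: Codable>(_ type: T.Type) -> T? {
        guard expireDate > Date() else { return nil }         // record is outdated, ignore it
        return try? JSONDecoder().decode(T.self, from: object)
    }
}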
3.3: SimpleCacheManagerForCodable
Now that we have defined a data structure for our cached records, we need a way to persistently store them. There are multiple ways to achieve this, but for the sake of simplicity, we will use UserDefaults. This has the advantage of being simple to implement, however, it has a few limitations:
- One is that it’s slower than using a more robust solution like CoreData,
- There is a size limitation for the objects that can be stored.
Therefore, if the size of your API responses is relatively small, UserDefaults can be a suitable option, but if the responses are large, you may want to consider using CoreData which will give you better performance and scalability when searching for your cached records.
Note: Here you can find the same implementation using CoreData
Our cache storage manager can be implemented using various methods, such as UserDefaults, CoreData, or others. To ensure ease of testing and the ability to switch between implementations, it is important that all implementations conform to the same interface or protocol. With this in mind, we will begin by defining a generic interface or protocol:
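A sketch of what such a protocol could look like; the exact signatures below are assumptions based on the description that follows:

import Foundation

/// Any cache store (UserDefaults, CoreData, ...) we want to plug in must conform to this.
public protocol CodableCacheManagerProtocol {
    /// Stores a Codable value, identified by `key` + `params`, with an optional time to live (in minutes).
    func syncStore<T: Codable>(_ some: T,
                               key: String,
                               params: [String],
                               timeToLiveMinutes: Int?)

    /// Retrieves a previously stored value (and the date associated with the record),
    /// or nil if it does not exist or has expired.
    func syncRetrieve<T: Codable>(_ type: T.Type,
                                  key: String,
                                  params: [String]) -> (model: T, recordDate: Date)?
}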
The protocol is called CodableCacheManagerProtocol, and just defines two functions for working with cached records.
- The first function is syncStore, which takes in a generic type T that conforms to the Codable protocol, a key, and an array of params. It also takes an optional timeToLiveMinutes, which represents how long the cached record should live. This function is used to store a Codable object (our API response) in the cache. The key and params are used to uniquely identify the record in the cache.
- The second function is syncRetrieve, which takes in a type T that conforms to the Codable protocol, a key, and an array of params. It returns a tuple of type (model: T, recordDate: Date)?. This function is used to retrieve a cached record from the cache. The key and params are used to locate the specific record, and the function returns a tuple containing the model of the cached record and the date the record was stored. If the record is not found or has expired, the function returns nil.
Now, using UserDefaults, we create a class SimpleCacheManagerForCodable, which implements the CodableCacheManagerProtocol and uses UserDefaults to store and retrieve cached records.
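A possible sketch of that class, assuming the protocol and the ExpiringCodableObjectWithKey wrapper sketched earlier (the suite name, the shared singleton, and returning the expiry date as recordDate are assumptions):

import Foundation

public class SimpleCacheManagerForCodable: CodableCacheManagerProtocol {
    public static let shared = SimpleCacheManagerForCodable()
    private let defaults = UserDefaults(suiteName: "SimpleCacheManagerForCodable")

    public func syncStore<T: Codable>(_ some: T, key: String, params: [String], timeToLiveMinutes: Int?) {
        guard let record = ExpiringCodableObjectWithKey(some,
                                                        key: key,
                                                        params: params,
                                                        timeToLiveMinutes: timeToLiveMinutes ?? 1440),
              let data = try? JSONEncoder().encode(record) else { return }
        defaults?.set(data, forKey: record.key)   // the composed key uniquely identifies the record
    }

    public func syncRetrieve<T: Codable>(_ type: T.Type, key: String, params: [String]) -> (model: T, recordDate: Date)? {
        let composedKey = ([key] + params).joined(separator: "_")
        guard let data = defaults?.data(forKey: composedKey),
              let record = try? JSONDecoder().decode(ExpiringCodableObjectWithKey.self, from: data),
              let model = record.extract(type) else { return nil }   // nil when missing or expired
        // This sketch only keeps the expiry date, so we hand that back as the record date.
        return (model, record.expireDate)
    }
}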
3.4: Putting it all together
The goal is to (maybe) make a request for data and handle the caching of that data, based on a provided cache policy.
3.4.1: Cache policy review
Again, remember, the cache policies are:
- ignoringCache: will make a request for new data and return it, but it will also store the response in the cache for future use.
- cacheElseLoad: will first check if the data is available in the cache; if it is, it will return the cached data, otherwise it will make a request for new data, store it in the cache, and return it.
- cacheAndLoad: will first check if the data is available in the cache; if it is, it will return the cached data; it will also make a request for new data, store it in the cache, and return it.
- cacheDontLoad: will check if the data is available in the cache; if it is, it will return the cached data, otherwise it will return an empty publisher.
3.4.2: Function signature
Our function takes in:
- A Publisher, which will be our API request. This publisher will be executed if we need to fetch data from our API
- The Type of the expected response object. This is how we know, when we read the cached data, what type we are reading
- The cache policy we intend to use
- A service key and an array of service parameters that we will use to identify our request
- And some class/entity, as long as it complies with CodableCacheManagerProtocol, that we will use to save or retrieve cached information. See Section 3.3, SimpleCacheManagerForCodable. A sketch of such a signature is shown below.
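A rough sketch of such a signature, assuming GenericRequestWithCacheResponse is a typealias over Combine's AnyPublisher (the parameter labels are illustrative, not necessarily the repository's exact ones):

import Combine
import Foundation

public typealias GenericRequestWithCacheResponse<T: Codable, E: Error> = AnyPublisher<T, E>

public func genericRequestWithCache<T1: Codable, E1: Error>(
    _ publisher: AnyPublisher<T1, E1>,            // the real API request, executed only when a load is needed
    _ type: T1.Type,                              // expected response type, used when decoding cached data
    cachePolicy: CacheStrategy,                   // one of the four policies from 3.1
    serviceKey: String,                           // identifies the service (e.g. #function)
    serviceParams: [String],                      // parameters that make the request unique
    cacheManager: CodableCacheManagerProtocol     // any store conforming to the protocol from 3.3
) -> GenericRequestWithCacheResponse<T1, E1> {
    // The inner helpers (cacheDontLoad, noCacheDoLoad, ...) are covered in 3.4.3;
    // a condensed body is sketched at the end of this chapter.
    fatalError("signature sketch only")
}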
3.4.3: Inner auxiliary functions
The final function will have 5 main blocks: 4 inner functions and a switch. Each inner function is specialized in one task only ("fetch from cache", "fetch from API"…).
Inner func cacheDontLoad()
- Will fetch CACHED data from our cache manager (SimpleCacheManagerForCodable) and return it.
func cacheDontLoad() -> GenericRequestWithCacheResponse<T1, E1> {
    if let storedModel = cacheManager.syncRetrieve(type,
                                                   key: serviceKey,
                                                   params: serviceParams) {
        return Just(storedModel.model).setFailureType(to: E1.self).eraseToAnyPublisher()
    } else {
        return .empty()
    }
}
Inner func noCacheDoLoad()
- Fetch NEW data, store it for future use, and return it.
func noCacheDoLoad() -> GenericRequestWithCacheResponse<T1, E1> {
    return publisher.onErrorComplete(withClosure: { unlock() })
        .flatMap({ model -> GenericRequestWithCacheResponse<T1, E1> in
            cacheManager.syncStore(model, key: serviceKey, params: serviceParams, timeToLiveMinutes: nil)
            if let model = model as? T1 {
                return Just(model).setFailureType(to: E1.self).eraseToAnyPublisher()
            } else {
                return .empty()
            }
        })
        .catch({ error -> GenericRequestWithCacheResponse<T1, E1> in
            return Fail(error: error).eraseToAnyPublisher()
        }).eraseToAnyPublisher()
}
Inner func noCacheDoLoadOrWait()
- Will fetch NEW data, OR, if the same request is already in flight (a duplicated request), will just wait for its response and return it.
func noCacheDoLoadOrWait() -> GenericRequestWithCacheResponse<T1, E1> {
    switch AvailabilityState.serviceStates[serviceKey]?.value ?? .free {
    case .free: return noCacheDoLoad()
    case .refreshing: return awaitForCache()
    }
}
Inner func awaitForCache()
- Will wait for the current request to complete and then return the cached value, to avoid unnecessary duplicated requests.
func awaitForCache() -> GenericRequestWithCacheResponse<T1, E1> {
    return AvailabilityState.serviceStates[serviceKey]!.filter({ $0 == .free })
        .flatMap { _ in return cacheDontLoad() }
        .eraseToAnyPublisher()
}
The last block is a switch where, according to the cache policy, we call one of the inner functions discussed before.
switch cachePolicy {
case .ignoringCache:
    return noCacheDoLoadOrWait()
case .cacheElseLoad:
    return cacheDontLoad().catch { _ -> GenericRequestWithCacheResponse<T1, E1> in
        return noCacheDoLoadOrWait()
    }.eraseToAnyPublisher()
case .cacheAndLoad:
    let cacheDontLoad = cacheDontLoad().onErrorComplete().setFailureType(to: E1.self).eraseToAnyPublisher()
    return Publishers.Merge(cacheDontLoad, noCacheDoLoadOrWait()).eraseToAnyPublisher()
case .cacheDontLoad:
    return cacheDontLoad()
}
And finally, we put all the pieces together into a single function.
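Since the complete listing lives in the repository, here is a condensed sketch of how the pieces could fit together. It reuses the CacheStrategy, CodableCacheManagerProtocol and signature sketched earlier and, to keep it short, omits the duplicate-request handling (noCacheDoLoadOrWait / awaitForCache) shown above:

import Combine
import Foundation

public func genericRequestWithCache<T1: Codable, E1: Error>(
    _ publisher: AnyPublisher<T1, E1>,
    _ type: T1.Type,
    cachePolicy: CacheStrategy,
    serviceKey: String,
    serviceParams: [String],
    cacheManager: CodableCacheManagerProtocol
) -> GenericRequestWithCacheResponse<T1, E1> {

    // Emits the cached record if one exists, otherwise completes without a value.
    func cacheDontLoad() -> GenericRequestWithCacheResponse<T1, E1> {
        guard let stored = cacheManager.syncRetrieve(type, key: serviceKey, params: serviceParams) else {
            return Empty<T1, E1>().eraseToAnyPublisher()
        }
        return Just(stored.model).setFailureType(to: E1.self).eraseToAnyPublisher()
    }

    // Runs the real request and stores every successful response for future use.
    func noCacheDoLoad() -> GenericRequestWithCacheResponse<T1, E1> {
        publisher
            .handleEvents(receiveOutput: { model in
                cacheManager.syncStore(model, key: serviceKey, params: serviceParams, timeToLiveMinutes: nil)
            })
            .eraseToAnyPublisher()
    }

    switch cachePolicy {
    case .ignoringCache:
        return noCacheDoLoad()
    case .cacheElseLoad:
        // Prefer the cache; fall back to the network only when nothing valid is stored.
        if cacheManager.syncRetrieve(type, key: serviceKey, params: serviceParams) != nil {
            return cacheDontLoad()
        }
        return noCacheDoLoad()
    case .cacheAndLoad:
        // Emit the cached value immediately (if any) and the fresh one when it arrives.
        return Publishers.Merge(cacheDontLoad(), noCacheDoLoad()).eraseToAnyPublisher()
    case .cacheDontLoad:
        return cacheDontLoad()
    }
}

The real implementation in the repository additionally tracks in-flight requests, so that a duplicated call simply waits for the cache to be refreshed instead of hitting the API twice.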
4. Usage
Finally, below is an example of how we add the generic cache behavior to any request.
The first function is our API request without any kind of cache.
The second function looks a lot like the first one, but can deal with the cache system. Notice that we just need to define our key, let serviceKey = #function, define the parameters that our request uses, let serviceParam: [String] = [param], and finally the expected API return type, let apiResponseType = ResponseDto.EmployeeServiceAvailabilty.self.
The function should take a cachePolicy as a parameter, allowing the selection of the best option based on the specific use case.
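Below is a hedged sketch of the two functions described above. ResponseDto, APIError and the URL are placeholder stand-ins so the example is self-contained; only the serviceKey / serviceParam / response-type pattern is the point:

import Combine
import Foundation

// Placeholder types, only so the sketch compiles on its own.
enum APIError: Error { case network }
enum ResponseDto {
    struct EmployeeServiceAvailabilty: Codable { let available: Bool }
}

// 1) The plain API request, without any kind of cache (URL and decoding are illustrative).
func employeeServiceAvailability(param: String) -> AnyPublisher<ResponseDto.EmployeeServiceAvailabilty, APIError> {
    URLSession.shared.dataTaskPublisher(for: URL(string: "https://example.com/availability/\(param)")!)
        .map(\.data)
        .decode(type: ResponseDto.EmployeeServiceAvailabilty.self, decoder: JSONDecoder())
        .mapError { _ in APIError.network }
        .eraseToAnyPublisher()
}

// 2) The same request, wrapped with the generic cache behavior from chapter 3.
func employeeServiceAvailability(param: String,
                                 cachePolicy: CacheStrategy) -> AnyPublisher<ResponseDto.EmployeeServiceAvailabilty, APIError> {
    let serviceKey = #function                                          // unique key for this service
    let serviceParam: [String] = [param]                                // parameters that make the request unique
    let apiResponseType = ResponseDto.EmployeeServiceAvailabilty.self   // expected response type
    return genericRequestWithCache(employeeServiceAvailability(param: param),
                                   apiResponseType,
                                   cachePolicy: cachePolicy,
                                   serviceKey: serviceKey,
                                   serviceParams: serviceParam,
                                   cacheManager: SimpleCacheManagerForCodable.shared)
}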
5. Final notes
In conclusion, caching is a crucial aspect of mobile app development that can greatly enhance the user experience. By implementing efficient caching techniques in Swift, such as the cache-and-load approach described in this article, developers can improve the speed and responsiveness of their apps, reducing the time users wait to access their content. Additionally, by abstracting the caching logic into reusable components, developers can simplify the caching process and focus on other important aspects of their app. Overall, efficient caching in Swift is a powerful tool that can make a significant impact on the user experience of mobile apps.
The SimpleCacheManagerForCodable (chapter 3.3) can be enhanced in two ways:
- Implement it using CoreData for increased performance and greater storage capacity. We currently use UserDefaults for simplicity.
- Implement a method for removing outdated cached records from SimpleCacheManagerForCodable to optimize storage usage (a possible shape for such a method is sketched below).
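A possible shape for such a cleanup method, assuming the UserDefaults-based sketch from 3.3 (the suite name and the method name are hypothetical):

import Foundation

extension SimpleCacheManagerForCodable {
    /// Hypothetical helper: walks every record in the suite and deletes the ones already expired.
    func deleteExpiredRecords() {
        guard let defaults = UserDefaults(suiteName: "SimpleCacheManagerForCodable") else { return }
        for (key, value) in defaults.dictionaryRepresentation() {
            guard let data = value as? Data,
                  let record = try? JSONDecoder().decode(ExpiringCodableObjectWithKey.self, from: data),
                  record.expireDate <= Date() else { continue }
            defaults.removeObject(forKey: key)   // the record is outdated, free the storage
        }
    }
}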
6. Materials
The full project with the code can be found here