The flexible consumption model of IT asset finance
While buzzwords and phrases fall in and out of fashion, one that has become common parlance among IT finance organisations in recent times is the 'flexible consumption model'. This is "actually quite a useful and accurate description of a modern form of IT asset finance", according to Dave Roberts, EMEA sales vice-president at Dell Financial Services (DFS), the captive IT financing organisation of Dell EMC.
"I think people can get bogged down with the term 'flexible consumption model' and, in reality, it can mean different things to different people," he says. "Yes, it's a tagline, but it accurately describes a type of financing that is tailored to a customer's specific scenario. Three decades ago, if you used the word 'rental' to those responsible for acquiring technology for their company, these people would throw you out of their office. But the same people would be open to 'leasing'. More recently, the term 'rental' has had a revival, and today [the] 'consumption-based finance model' is what everyone is talking about."
A key aspect of a flexible consumption model is billing a client based on asset use or 'pay per use'. This approach, often referred to as 'utility pricing', is not new and began in 1826, when Samuel Clegg invented the gas meter. Over the past ten years, utility pricing has gained traction in the enterprise data centre or server room. The key consumption parameter is storage capacity and the metering system is typically software that remotely monitors capacity fluctuations in real time.
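The mechanics of utility pricing reduce to straightforward arithmetic: a committed baseline is billed at one rate, and metered usage above that commitment at another. The sketch below is purely illustrative; the function name, rates and billing structure are assumptions for explanation, not an actual DFS or Dell EMC pricing model.

```python
# Hypothetical sketch of utility-style billing for metered storage.
# All names and rates are illustrative assumptions, not a real pricing API.

def monthly_storage_bill(daily_usage_tb, committed_tb, base_rate, burst_rate):
    """Charge the committed baseline at base_rate per TB, plus any
    metered usage above the commitment (averaged over the billing
    period) at burst_rate per TB."""
    avg_overage = sum(max(u - committed_tb, 0) for u in daily_usage_tb) / len(daily_usage_tb)
    return committed_tb * base_rate + avg_overage * burst_rate

# Example: 30 days of metered usage around a 100 TB commitment,
# with a ten-day spike to 130 TB.
usage = [95] * 20 + [130] * 10
bill = monthly_storage_bill(usage, committed_tb=100, base_rate=20.0, burst_rate=30.0)
# bill = 2300.0 (100 TB x 20.0 baseline + 10 TB average overage x 30.0)
```

The remote metering software Roberts describes would supply the daily usage figures; the commercial terms simply determine the two rates.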
What makes the model flexible is that it can take into account the need for reduced capacity; for example, when storage is not needed for long periods of time, or when a huge chunk of files needs to be deleted. So the model is suitable for companies that are growing at an exponential rate and need greater capacity, as well as those with sporadic or seasonal demand spikes. A flexible model is proving popular among those responsible for acquiring equipment. In a recent study by IDC, 63% of respondents said they thought the availability of a flexible consumption model was very important when selecting an IT infrastructure provider.
The IFRS 16 accounting standard was published in January 2016 and has an effective date of January 2019. It requires lessees to recognise nearly all leases on the balance sheet, which will reflect their right to use an asset for a period of time and the associated liability for payments. The question is, what impact will IFRS 16 have on consumption-based models?
"I think that once IFRS 16 comes in, it'll become much clearer for all parties what the acceptable accounting methods are. But my overriding feeling is that it's not going to stop customers," says Roberts. "Customers aren't going to want to pay cash for their equipment. It will be just a way of structuring deals, to make sure they fit within their own accounting policies.
"But certainly, in terms of the consumption models, if you want the key factors, particularly on our flex and utility models, the customer doesn't own the infrastructure, they're paying to use it. Whether that can be determined as a lease or not will depend on how the deal's structured, but I feel that, as an organisation, we're best positioned to make sure that we can tailor the offerings to fit how the customer needs to account for it."
Growing with the client
To meet the flexible consumption demand, DFS has developed a suite of products called OpenScale. For high-growth clients, the product is known as 'pay as you grow'. This is a plan that doesn't penalise clients for exceeding their storage capacity plan, Roberts explains.
"For a 'pay as you grow' arrangement, you could, on day one, acquire all the equipment that's needed for the next three years, but make the payment profiles relate to how it's going to be used," he says. "So in year one, let's say it's the equivalent of only using 50% of the equipment. In year two, 75%, and in year three, up to 100% use. Because it's a payment profile, rather than an asset-based scenario, if you overuse what's predicted, it doesn't matter. So it de-risks a lot of that type of scenario."
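The ramped payment profile Roberts describes can be expressed as a one-line calculation: a flat annual charge scaled by the expected utilisation each year. The figures below are hypothetical, chosen only to mirror the 50/75/100% example in the quote.

```python
# Illustrative 'pay as you grow' payment profile: all equipment is
# installed on day one, but payments ramp with expected utilisation.
# Figures are hypothetical, not actual DFS pricing.

def ramped_payments(full_annual_cost, utilisation_profile):
    """Scale a flat annual charge by the expected utilisation each year."""
    return [round(full_annual_cost * u, 2) for u in utilisation_profile]

payments = ramped_payments(120_000, [0.50, 0.75, 1.00])
# Years 1-3 pay 60,000 / 90,000 / 120,000; because the schedule is a
# payment profile rather than per-asset billing, actual overuse in any
# year does not change what is owed.
```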
For clients with sporadic capacity requirements, Roberts gives the example of a computer game developer that has been able to scale up at the height of its R&D work, and scale down again once the game is released to the market.
"Once the game is finalised by the developer, 70% of that data becomes redundant, so they can then get rid of it, but for the time that they're developing [the game], which could be anywhere between one and 12 months, they need to keep that data and store it, and they need space to do that," he says.
In either scenario, Roberts reinforces the importance of over-provisioning data centres and server rooms with a buffer of technology: if systems fail and stay down for two or three days due to a lack of storage or compute availability, the cost to a business is huge.
Hyper-converged and flexible
Many companies are tempted to migrate their in-house data centres to the public cloud, due to the perceived cost and complexity of owning and managing the assets themselves. But Roberts explains that the latest data centre equipment has become far more integrated, scalable and cost-effective.
"The advent of hyper-converged infrastructure has meant that the key elements of compute, storage and networking are now presented in an integrated unit that is managed and defined by intuitive software, which makes installing and scaling up far quicker," he says.
"For customers looking to achieve the flexibility and operating expense budget of public cloud, but who want to keep control of their data on-premise, we also have an industry-leading solution called CloudFlex. Under this model, the customer benefits from Dell EMC's hyper-converged VxRail technology under an initial 12-month use contract. At this point, the customer can return the equipment or extend on a fully flexible quarterly basis, at a price that declines annually. It really is about providing customers with choice. The OpenScale suite of solutions delivers speed, scale and agility."
Demand for data storage
The demand for storage capacity is being driven by the exponential growth in data. According to IDC, global data has grown from 4ZB in 2013 to a projected 44ZB in 2020, and 180ZB by 2025.
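A quick check shows what those IDC figures imply as a compound annual growth rate (CAGR); the function below is just a sanity-check on the cited numbers, not part of the IDC research.

```python
# Implied compound annual growth rate of the global data estimates
# cited above (4 ZB in 2013, 44 ZB in 2020, 180 ZB in 2025).

def cagr(start, end, years):
    """Compound annual growth rate between two values over a period."""
    return (end / start) ** (1 / years) - 1

growth_2013_2020 = cagr(4, 44, 7)    # roughly 41% per year
growth_2020_2025 = cagr(44, 180, 5)  # roughly 33% per year
```

In other words, the projection assumes global data keeps compounding at well over 30% a year throughout the period.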
The internet of things (IoT) revolution, where sensors collect data from assets ranging from hotel elevators to smart thin-film labels on consumer products such as bottles of whisky, is very much a 'big data' generator. According to Gartner, the number of internet-connected devices is predicted to grow from around 8.4 billion in 2017 to over 20 billion by 2020.
In addition to the growth of data-guzzling applications such as virtual reality, 4K video and social networks in which higher-resolution videos and photographs are uploaded to sites, the demand for storage is also driven by the need to retain information, especially for regulatory reasons.
CCTV footage and number-plate data, for example, are retained for crime prevention, while email correspondence and transactions in the financial services sector also need to be saved to comply with banking legislation.
'As a service' meets 'pay as you go'
Companies leaning towards the public cloud for all their computing and capacity needs often do so because they feel they lack the personnel to install and manage data-centre equipment themselves. Roberts says DFS bundles asset life-cycle management into its agreements, and specialist Dell EMC engineers are available 24 hours a day for support. This combination is proving popular with acquirers of technology, as it provides customers with greater choice and flexibility, at optimal cost. According to IDC's survey, 79% of respondents said that they wanted a bundle of equipment, software, services and maintenance within their future 'pay as you go' arrangements.