Is Basel IV around the corner?

I recommend reading the discussion paper issued by the Basel Committee in July 2013: The regulatory framework: balancing risk sensitivity, simplicity and comparability. You can download the document at

The paper confirms a recommendation I have been giving banks for some time: that they should separate Economic Equity, used as a management metric, from Regulatory Equity, used as a supervisory-imposed constraint. The two are different, their uses are different, and the models used to calculate them should therefore be different. The distinction is comparable to that between management accounting and financial accounting.

For the first time I read that the Basel Committee has accepted the idea that these two are indeed distinct and need not necessarily converge. Admitting this considerably changes how banks can respond to supervisory constraints and how the Committee may change its Capital Adequacy recommendations. The Committee has in fact discreetly unveiled some fundamental potential changes that could be excellent news for the vast majority of banks, for bank supervisors and in general for all bank stakeholders. Good news for all, except for some solution vendors and consulting firms!

But before we discuss these potential changes, there are a few other comments made by the Committee in this paper which need to be underlined.

I will pass over the very good job done by the Committee in justifying its work on Basel I, Basel II, Basel 2.5 and Basel III. The work done is indeed impressive, and the paper is worth reading if only to recall some of the fundamental goals that led to the development and evolution of the capital adequacy principles and rules. I also appreciated the frank discussion of the key requirements in defining risk-based equity requirements: risk sensitivity, simplicity and comparability. We all agree that there are some good risk-sensitivity models, but also that they contain certain weaknesses, that they are not simple, and that they do not always allow for easy comparison. What is not explicitly mentioned, but is implicit in the paper, is that the rules and comments are written for large internationally active banks (IAB), and that the discussions focus on the Global Systemically Important Banks (G-SIB). These are of course a minority of the total number of banks, and although they deserve the full attention of the regulators and of the Committee, it would be a mistake to disregard the second-tier and smaller banks, for which internal models are probably not the appropriate approach, at least for compliance. How often have I met bank senior managers and board members who stated they wanted to go advanced IRB as a matter of principle, without any rational justification?

The resources (financial and human) required to implement internal models are out of the reach of most banks and in many cases would lead to wrong estimations of risk and hence faulty strategies. The reasons for this are many, and the regulators' controls are often lacking. The ultimate bad choice is for banks in emerging financial markets to opt for internal models: the tail risks in these markets, riddled with black swans and lacking market depth and efficiency, will be huge.
In very diplomatic terms the Committee is not far from admitting this and even suggests that a radical revision of the rules is possible (in the long term), which could include defining compliance on the basis of a “Leverage and a Standardised approach” (point 75, bullet 2 of the discussion paper).

“Under such an approach, the regulatory framework would use a leverage ratio and a standardised risk-based approach together, but abandon the use of the internal models approach. This would preserve the “belt and suspenders” approach introduced by Basel III thus limiting regulatory arbitrage and over-reliance on any single model. It would also substantially simplify the regulatory framework, and make the derivation of bank capital ratios more transparent and understandable for all, although ex ante risk sensitivity would again be reduced.”

Personally I see many advantages in such an approach for the whole financial market, as the rules could be applied to all banks, on all continents, in all markets. The use of internal models could still be allowed, but only for a limited number of banks such as the G-SIBs and a selection of large IABs; these banks would also disclose under the standard “one size fits all” rule, allowing stakeholders to compare banks with one another and, for the small group using internal models, to compare those results with the standard rules. The segregation of disclosed regulatory risk capital should also make clear the cost (in capital) of being a systemically important bank. Shareholders must know the cost of being TBTF.

Basel IV is around the corner, but how far is the Committee ready to change? Going to the kind of simplification suggested above would probably kill its credibility!
But a major simplification of that nature would meet the constraints of simplicity, comparability and still be risk based.
The responses to the discussion paper are to be sent to the Committee by October 11th 2013. I would suggest that responses not be shy and go the whole way towards simplification. Remember simplification towards standardised rules does not mean “stupid rules” but transparent, understandable and effective risk based rules.
The separation of capital adequacy constraints from management constraints, through independent (but reconcilable) equity models, is an important change in the Committee’s view. It allows for a refined analysis of risk on the basis of the objective of each measure, compliance and management. Management can use simple or more complex internal models if they actually believe that improves their capacity to optimise risk adjusted value and profitability management.
I’m interested in your comments, please tell me what your position is.



Filed under Compliance Management

Credit Process application

In a previous post I mentioned and described a solution to manage and optimise the credit process for small banks, called ARGOS, a solution developed by a Dutch partner. They have published a brief descriptive video on YouTube, which I suggest you check out on

If there is any interest to discuss this please don’t hesitate to contact me at


Filed under workflow and process management

Basel III leverage impact on African Banks

Regulatory Basel III Leverage ratio impact on emerging banks in Africa

The “Revised Basel III leverage ratio framework and disclosure requirements consultative document” was published in June 2013 for comments by 20 September 2013. One can argue about the relevance of such a ratio for bank risk supervision, but that is not my intent in these comments. I only wish to analyse the potential impact of the ratio on emerging banks and markets.
The proposed leverage ratio is defined as the Capital Measure (the numerator) divided by the Exposure Measure (the denominator) expressed as a percentage and calculated on the basis of the average of the 3 month-end ratios of the quarter. The Capital Measure is the Tier 1 capital (as per the BIII standards) and the Exposure Measure is the accounting measure of the bank’s on-balance sheet and off-balance sheet exposures (without netting), net of specific provisions and valuation adjustments (e.g. credit valuation adjustments).
The exposures are defined in detail in the proposed Basel III document, but there are no fundamental problems and all the data should be readily available, with some simple calculations to estimate the exposures on derivatives.
The disclosure requirements are simple and just require transparency by giving the detail of the calculation and amounts as per the template reports included in the Basel proposal.
So we can be reassured that this will not require an extensive systems development effort, and that all banks with a minimum accounting capability and access to financial data will have no problem calculating the ratio.
The problem for some of the Western banks is not the calculation but meeting the norm. The minimum regulatory leverage ratio is 3%, i.e. Tier 1 capital must exceed 3% of total exposures; in other terms, the maximum exposure is limited to 33.33 times Tier 1 capital. The current limit and definitions were confirmed and are a regulatory requirement tested by the authorities during the parallel run from January 2013 to January 2017. Throughout that period the minimum ratio remains at 3%. The definitions will be finalised in the first half of 2017 and integrated into Pillar 1 as of 1 January 2018.
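The mechanics described above can be sketched in a few lines of Python. Only the 3% minimum and the quarterly averaging of the three month-end ratios come from the Basel proposal; the figures below are hypothetical.

```python
# Basel III leverage ratio sketch: Tier 1 capital / exposure measure,
# averaged over the three month-end ratios of the quarter.

def leverage_ratio(tier1_capital, total_exposure):
    """Leverage ratio as a fraction: Tier 1 capital / exposure measure."""
    return tier1_capital / total_exposure

def quarterly_leverage(month_end_ratios):
    """Average of the three month-end ratios of the quarter."""
    assert len(month_end_ratios) == 3
    return sum(month_end_ratios) / 3

MINIMUM = 0.03  # 3% minimum, i.e. exposure capped at ~33.33x Tier 1

# hypothetical month-end (Tier 1, exposure) pairs
ratios = [leverage_ratio(t1, exp) for t1, exp in
          [(300, 9_000), (300, 9_500), (310, 10_000)]]
avg = quarterly_leverage(ratios)
print(f"quarterly leverage ratio: {avg:.2%}, compliant: {avg >= MINIMUM}")
```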
Leverage analysis in African markets
The impact of the leverage ratio on banks in Africa should be limited, as they usually have low non-risk-adjusted leverage. To illustrate this I used the financial data of the whole Zambian banking sector, as published by the Bank of Zambia. As of March 2013 the consolidated leverage ratio was 10.04%, versus the minimum of 3%. This is good news, as it will give these banks room to manage the much more difficult regulatory limit represented by the “Revised Liquidity Coverage Ratio and liquidity risk monitoring tools” published by the Basel Committee in January 2013.
A quick analysis of all the other African markets confirms this positive news. Remember this is a non-risk adjusted leverage measure. The same is not true when looking at risk adjusted measures of capital adequacy.
Liquidity risk: LCR impact on emerging markets
The Liquidity Coverage Ratio under stress is much more complex to calculate. Reviewing all the rules would not be possible in this short analysis; readers should refer to the Basel document, readily available on the BIS site (
The ratio itself is simple: it represents the proportion of High Quality Liquid Assets or HQLA (those that can be sold, i.e. transformed into cash, at short notice with imposed stressed haircuts, within markets that have some fundamental depth and price efficiency) to the 30-day stressed Net Cash Outflow.
The actual stress hypotheses, haircuts and cash-inflow and cash-outflow hypotheses represent the difficulty and modelling complexity. They require access to granular contract data to allow appropriate contract segmentation and application of the stress hypotheses. All banks with adequate data architectures and technology can easily develop the calculation model, with enough drill-down and drill-through functionality to report the ratio and, more importantly, to analyse and manage the different components of the HQLA and the Net Cash Outflows.
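The skeleton of the ratio can be sketched as follows. The 75% cap on inflows is part of the Basel standard; the haircut levels and all figures are illustrative assumptions, not the Basel calibration.

```python
# LCR sketch: stressed HQLA divided by 30-day stressed net cash outflows.

def stressed_hqla(assets):
    """assets: list of (market_value, haircut) pairs after stress."""
    return sum(value * (1 - haircut) for value, haircut in assets)

def lcr(assets, stressed_outflows, stressed_inflows):
    # Under the Basel rules, inflows are capped at 75% of outflows.
    net_outflow = stressed_outflows - min(stressed_inflows,
                                          0.75 * stressed_outflows)
    return stressed_hqla(assets) / net_outflow

assets = [(500, 0.00),   # e.g. cash / central bank reserves, no haircut
          (300, 0.15)]   # hypothetical Level 2 assets, 15% haircut
print(f"LCR: {lcr(assets, stressed_outflows=900, stressed_inflows=200):.0%}")
```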
The real problem is not the modelling. The real problem lies in the availability of HQLA and in the stability of the banks' cash flows. In advanced financial markets, banks have had to seek profitability in many structured, complex products that have low liquidity, and have not focused enough on stable funding (over-reliance on corporate and interbank short-term funding), resulting in large potential discounts on HQLAs and severe potential net cash outflows.
The banks and local regulators of the advanced markets have lobbied the Committee to bring the LCR limit down from the initial 100% during a transition implementation period. The Basel Committee has agreed to impose a step-up minimum LCR requirement:
1 January 2015: 60%
1 January 2016: 70%
1 January 2017: 80%
1 January 2018: 90%
1 January 2019: 100%
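The phase-in schedule above can be expressed as a simple lookup; the dates and percentages are taken from the schedule, everything else is illustrative.

```python
# Lookup of the phased-in minimum LCR from the Basel step-up schedule.
from datetime import date

LCR_PHASE_IN = [  # (effective date, minimum LCR)
    (date(2015, 1, 1), 0.60),
    (date(2016, 1, 1), 0.70),
    (date(2017, 1, 1), 0.80),
    (date(2018, 1, 1), 0.90),
    (date(2019, 1, 1), 1.00),
]

def minimum_lcr(as_of):
    """Return the minimum LCR in force on a given date (None before 2015)."""
    applicable = [m for start, m in LCR_PHASE_IN if start <= as_of]
    return applicable[-1] if applicable else None

print(minimum_lcr(date(2016, 6, 30)))
```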

This is good news for the banking industry, as it will avoid severe corrections which would have had to be paid for by bank clients and bank shareholders.
The problem for emerging markets and banks is twofold:
1. Insufficient stable long term funding,
2. Imperfect financial markets that in general do not qualify as highly liquid under the LCR regulations.
The insufficiency of long-term stable funding finds its source in structural characteristics of the market, but also in the lack of appropriate financial product offerings by banks. Many banks also show weak liquidity management, with over-reliance on retail, corporate and institutional short-term funding. Often these funds are in non-maturity contracts (current accounts and savings accounts). On the other side of the balance sheet, assets are long in liquidity maturity even if they are short in interest-rate repricing maturity, and they are mostly totally illiquid. Even the sovereign assets in the investment portfolios will usually not meet the liquidity criteria of the Basel LCR methodologies.
Conscious of this, the Basel Committee has issued an “Alternative Liquidity Approach (ALA)” applicable in those markets at the local supervisor's initiative. But even these rules will not allow many banks to reach the minimum LCR: assets will not qualify as HQLA, and stressed outflows could reach up to 90% of total liabilities.
I tried to estimate the LCR for the banking industry in Zambia on the basis of the consolidated sector data sourced from the BoZ. The information available was clearly not sufficient, but a quick and dirty estimation would at best result in an LCR of less than 10%!
That does not mean that the Zambian banks are mismanaged and the market on the verge of default. It just shows that the LCR rules (even under the ALA) are not easily applicable, and that for them to be a useful criterion the banks and the authorities will need to work together to allow convergence towards international liquidity standards over at least the whole transition period, until 2019. In other words, the banking authorities and the banks' leadership must agree to work towards an environment that allows compliance with international standards.
The authorities will have to support the creation of a liquid financial market and support market-making activities. Banks will have to change their product mix and fundamentally review and enhance their ALM capabilities, in interest rate and liquidity management but also in foreign exchange risk management. Banks will have to invest more in sovereign debt, both local and foreign, creating other risks, among which foreign exchange risk. Pricing strategies must integrate risk-based remuneration… Bank strategies need to be reassessed on the basis of focused liquidity risk management!
Leverage and LCR
With a leverage ratio of over 10%, banks have ample room to increase the size of their balance sheet and arbitrage the limitations of the Risk Weighted Assets (RWA) compliance ratio. With Regulatory Capital set at a minimum of 8% of RWA, banks are de facto limited to a risk-based asset leverage of 12.5 times equity. Compared to the authorised non-risk asset leverage of 33.33 times equity, there is a large potential increase in total assets. Banks can thus very easily double their size to constitute liquid investment portfolios. If these are insufficient in the local market, they must seek them abroad (ALA approach) and manage the forex risks. The cost of this strategy is of course an immediate reduction of the bank's Return on Assets (the liquid assets have reduced yields, as they are risk-free investments). Depending on the cost of funding (which will increase) and the resulting Net Interest Margin on liabilities, the RoA could drop by 0.50 to 1.00%. But even so, the excess return over cost of funds, stable operating costs and increased asset leverage could result in an increase in RoE of potentially 2%. These are rule-of-thumb estimations which need to be refined on a bank-by-bank basis.
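The arithmetic above can be checked with a back-of-the-envelope sketch. All inputs are illustrative assumptions for a hypothetical bank, and the net spread on the added liquid assets is a guess that ignores second-order funding effects.

```python
# Back-of-envelope leverage arithmetic for a hypothetical bank.

equity = 100.0
assets = 10.0 * equity            # ~10% leverage ratio, as in the Zambian figures
roa_before = 0.025                # assumed 2.5% return on existing assets

# Doubling the balance sheet with liquid, low-yield assets stays well
# inside the 33.33x cap (and the 12.5x risk-weighted cap, since the
# added liquid assets carry low risk weights).
added = assets                    # double the balance sheet
net_liquid_spread = 0.0025        # assumed 0.25% margin net of funding cost

profit = assets * roa_before + added * net_liquid_spread
roa_after = profit / (assets + added)
roe_before = assets * roa_before / equity
roe_after = profit / equity
print(f"RoA {roa_before:.2%} -> {roa_after:.2%}; "
      f"RoE {roe_before:.1%} -> {roe_after:.1%}")
```

With these assumptions RoA drops by roughly a percentage point while RoE rises by about 2.5 points, in line with the rule-of-thumb ranges given above.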
Of course, to achieve this banks will have to develop a very aggressive liability-driven strategy to access scarce funding. That strategy needs to focus on all sources of deposits and financing, within and outside the local market, in retail, corporate and capital markets.
The impact of this from the client’s perspective will probably be an increase of interest rates on loans (to pay for the asset illiquidity) and an increase on deposit remuneration to justify more stable long term funding.
The effect on shareholders will be positive for the banks that apply these strategies because of the potential improvement of the financial efficiency (leverage) and reduced equity risk (liquidity).
Banks must in parallel implement drastic operational efficiency improvements to reduce the risk of underperforming in such a financial environment, while investing more heavily in risk management and treasury management. Will all banks be able to pick up that challenge? Probably not! But the leaders and market makers will dominate the market very quickly.
The regulators should also integrate these strategies within the country’s economic environment as this can dampen short term economic growth by draining liquidity from the market. The impact on shadow banking should also be analysed as this could easily be transformed into a significant competitive advantage to non-bank financial institutions. Accelerating and enhancing revised regulatory constraints on these non-bank financial institutions will certainly be required.
More than the previous Basel II regulations on credit, market and operational risks, liquidity management will seriously affect the banking sector in emerging markets. Supervisors will have to adapt local regulations to make them achievable by all banks and avoid brutal shifts in the industry, which could have very negative consequences for their economies as a whole.
Compliance with the leverage ratio is not an issue, as it gives banks room to adapt to the LCR constraints; but as they do, the regulators will need to watch with more attention for any excess leverage by the more aggressive banks.
Banks, big or small, that recognise these challenges, are ready to adapt to the changing environment and anticipate the changes will come out on top of the industry. As always, the challenge is an opportunity!


Filed under Compliance Management

Customer valuation in retail banks, Part 1


I mentioned earlier in my series on risk and relationship based pricing that client valuation was an important component of the retail strategy and that I would post my ideas on the subject.
Before writing this, I spoke with a number of banks and checked the current state of customer valuation policies and models on the internet. This very brief analysis confirmed what I believed to be the current state of the market, which I can summarise as follows:
1. Client valuation is not a systematic process. This management information is sometimes totally absent, often based on weak models, and rarely integrated into a customer-centric, value-based management policy.
2. Models are nearly always based on current product profit and rarely on risk-adjusted value metrics like Life Time Value, although some industries are more advanced than others in this approach. The banking industry is definitely lagging!
3. The LTV models used are most often derived from physical product sales and from service providers such as telcos. Consequently they are not adapted to financial products, which have very specific characteristics (see Risk and Relationship Based Pricing Part 1).
4. The difficulty of developing valuation models lies less in their formulation than in access to detailed behavioural client data, although this data is developed in many analytical CRM programmes.
5. Customer valuation can only be achieved if the Marketing, Finance and Risk Management silos are bridged. This is still the exception rather than a standard operating model, in big and small banks alike. The integration is easier in smaller institutions, but they often lack the data… and the expertise!

If you don’t work for a financial institution you will find many sources of quality information on the web on customer value, Customer Life Time Value, Customer Equity… Even Harvard Business School proposes a model with a spreadsheet download ( ). I also recommend Strategic Planning: What’s the Lifetime Value of Your Customers? by Erica Olsen, from Strategic Planning Kit For Dummies, 2nd Edition. Also check the following blog, which has a great description and discussion of the subject (

BUT NOTE: none of these sites discusses LTV for banking products. The only one that does refer to banks is Don Peppers ( But I believe the approach developed there has major model weaknesses, because Don Peppers and friends are more marketing-oriented than finance-focused. That being said, read their material on managing trust in banking; it is good!

My last comment in this introduction: bank client value metrics have been developed primarily for retail markets, not for wholesale. The concepts of client value and lifetime value are of course also applicable in theory to wholesale banking. Although the principles are similar, the usefulness of a precise calculation differs because of the nature of corporate relationships. What retail banks must develop through an integrated, data-centric system, the corporate banker will develop on a 1:1 basis, using more specific and complex information. He will do this when developing the “relationship plan” and will integrate that information into the corporate business model (usually a corporate finance model). This makes it sufficiently different to treat the approach separately from retail banking.

Customer value in Retail banking

When I mention customer value I am not referring to a vague idea of positive, high, low or negative value. I am referring to a precise calculation of a financial value based on corporate finance theory. At the highest level, customer value is the present value of the profit cash flows the bank will generate from the existing portfolio of products sold to a customer: those with a contractual maturity, which generate profits until that time; plus the profit flows of products without contractual maturities, which will probably be maintained over a foreseeable future; plus future sales of maturity and non-maturity financial products and services.
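As a minimal sketch of that definition, assuming a flat discount rate and entirely hypothetical profit flows:

```python
# Customer value as the present value of projected profit cash flows:
# existing contractual products to maturity, non-maturity products over
# a foreseeable horizon, and expected profits from probable future sales.

def present_value(cash_flows, rate):
    """Discount a list of yearly profit cash flows at a flat rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# hypothetical client: a loan running to contractual maturity, a
# non-maturity account assumed to persist five years, and
# probability-weighted profits from expected future sales
loan_profits = [120, 110, 100]
account_profits = [40] * 5
future_sales_profits = [0, 30, 60, 60, 60]

rate = 0.08
customer_value = (present_value(loan_profits, rate)
                  + present_value(account_profits, rate)
                  + present_value(future_sales_profits, rate))
print(f"customer value: {customer_value:.0f}")
```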

This is of course different from a simple formula that multiplies an average product profitability per sale by the projected number of sales per year, minus the costs of sales, the whole projected over a 1-to-3 or 5-year horizon with appropriate variables to derive a statistical probability of achieving an “expected” profit of x over that period (with or without present-valuing the projected profits). That is fine if you want to project the profitability of selling a pre-paid mobile phone card, a theatre ticket or a car… but it is inconclusive if you want to measure the profit flow of a 25-year variable or fixed rate mortgage loan, or of a credit card…

If what I say is true, why are the banks not at the forefront of LTV, and would this be of use in their management challenges?

To illustrate the importance of profit and value metrics, let me run through a case study based on true data and management strategies of a very large North American retail bank (name withheld for obvious reasons, although the information used is publicly available in its annual reports and presentations). I will also use the results of an LTV demo calculation based on bank data to illustrate the benefits of LTV.

The case study will complete this first chapter of the Client Value series. In later posts I will describe in more details the model and client centric strategies that it allows.

Customer value case study

The bank initially used a very simple client profitability analysis based on relative average product profitability. The profit margin (in %) was multiplied by the average outstanding of the products (loans, deposits…) and by the turnover or usage of the financial services used (payment services…). Adding all these profit amounts gave the bank a good idea of the profit contribution of a client across all his product holdings. It also allowed the calculation of the profit contribution by household.
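That first-generation calculation can be sketched as follows; the margins, products and figures are invented for illustration.

```python
# Simple client profitability: product margin (%) x average outstanding,
# plus a per-use profit on services. Household profit is the sum over members.

PRODUCT_MARGINS = {"mortgage": 0.010, "savings": 0.015}   # assumed % margins
SERVICE_MARGINS = {"payments": 0.20}                      # assumed profit per use

def client_profit(balances, usage):
    balance_profit = sum(PRODUCT_MARGINS[p] * avg for p, avg in balances.items())
    service_profit = sum(SERVICE_MARGINS[s] * n for s, n in usage.items())
    return balance_profit + service_profit

profit = client_profit({"mortgage": 150_000, "savings": 8_000},
                       {"payments": 120})
print(f"annual profit contribution: {profit:.2f}")
```

Note how everything here is an average: the same margin applies to every client's mortgage, which is exactly the oversimplification discussed further on.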

Like many other companies and industries, the bank looked at the profit contribution of each client and segmented them into 10 profitability deciles. And, lo and behold, the well-known rule appeared: 80% of the profits were generated by 20% of the clients! So obviously they devised strategies to concentrate marketing and relationship management expenses on the high profit contributors. You can't argue against this.

client profitability 80-20 rule

They could of course also analyse the relationship between profitability and other client variables, such as age. The graph below clearly indicates that older clients have a higher profitability (investment portfolios, high current account balances, etc.), while the 0-to-24-year-olds were loss leaders.

Client profitability segmentation

All very interesting and useful for developing the marketing strategy (and these are just two simple examples of the analyses possible).

But… a few years later the bank realised that the profitability data was not correct, or at least was oversimplified. The profit calculation used average revenues and average cost allocations, which do not reflect the true nature of either because they are not customer-specific. To get close to reality, the bank had to individualise all revenues and costs at contract granularity. For example, the use of banking services can have different profit contributions depending on how and where they are generated; a simple example is the cost of a bank transfer made over the internet versus through the branch. The same is true for all products, hence the bank must recognise this by going granular. From granular profitability at product level, it can then analyse profitability along all dimensions: by client, household, branch, region, product, risk rating…

When this bank calculated the new profitability deciles, it appeared that up to 75% of the customers moved from their original decile to another one! The consequence was that the bank had been concentrating marketing spend on possibly low-profit-contribution clients, and vice versa.

After a large investment in technology and systems ( products), the bank set up the new system and developed its marketing strategy. This is what it presented and approved (the actual figures have of course been changed and some of the information hidden; nevertheless the table below is a good summary of the official strategy of that large retail bank).

Client value Case Study 1

Impressive! See how they allow only a very small acquisition of low-value customers (you can't turn them down outright, but you can discourage them from banking with you through uncompetitive pricing, limited product offerings…). At the same time the bank plans to increase acquisition in profitability deciles 1 to 3 and to increase its share of wallet with those clients. Of course the bank will try to increase attrition in deciles 6 to 10.
If this actually happens expect big bonuses for the retail bank leadership.

But… as before are the profitability calculations correct? Is the model appropriate?

I had the opportunity to present to this bank a demo of an alternative customer valuation model. The proposed model was a Life Time Value model based on some real bank data (not the North American bank's).
The first approach uses a profitability model equivalent to that of the bank discussed above. The client segmentation was based on socio-economic characteristics. The results by client segment are shown in the table below.

Client profitability Case Study 2

Nothing specific to note, other than that there are major differences in profitability by segment; but that was already known.

Next we tried to estimate profitability for each customer segment through simple multipliers. This showed some major differences but was not acceptable, because these were generic estimations based on expert knowledge. Finally we applied the LTV model, separating Current Value, Future Value and Life Time Value. Remember, the current value is the present value of all profit cash flows (adjusted for risk…) of the customer's current product holdings, while the future value is the present value of the profit cash flows of future sales to the customer. LTV is the sum of the two. Both projections are of course adjusted for attrition risks, prepayment risks… as well as for “sales” propensities, probabilities of roll-overs, etc.
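A simplified sketch of that decomposition, with hypothetical flat retention and sales-propensity figures standing in for the full attrition, prepayment and propensity adjustments:

```python
# LTV = Current Value (PV of profits on existing holdings, decayed for
# attrition) + Future Value (PV of probability-weighted future sales).

def pv(cash_flows, rate):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, 1))

def current_value(profits, retention, rate):
    """PV of existing-holding profits, decayed by a flat retention rate."""
    adjusted = [cf * retention ** t for t, cf in enumerate(profits, 1)]
    return pv(adjusted, rate)

def future_value(sale_profits, propensity, rate):
    """PV of future-sale profits, weighted by the probability of the sale."""
    return pv([cf * propensity for cf in sale_profits], rate)

rate = 0.08
cv = current_value([100, 95, 90], retention=0.93, rate=rate)
fv = future_value([0, 50, 80, 80], propensity=0.35, rate=rate)
print(f"CV={cv:.0f}  FV={fv:.0f}  LTV={cv + fv:.0f}")
```

A young graduate would show a low CV and a high FV; a retired client the opposite, which is exactly the contrast the case study surfaces.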

The results are I believe interesting. I have highlighted some segments to underline the changes in the customer valuation. Check segments 1, 7 and 9 for example.

Client profitability Case Study 3

This indicates that some of the very low current-profitability clients end up as good or very good customers (they have a high future value), while some currently profitable customers do not increase in value (they have a low future value). It is of course obvious what these two segments could be: the young university graduate versus the older retired person. If you focus only on current profitability you will have serious growth problems and will probably see your share of the market shrink over time. That would have a very negative impact on the bank's market capitalisation.

When I showed this to the North American bank, they actually confided to me that they had come to the same conclusion and were actively working on an LTV model. The strategy they had proposed had been cancelled because of the huge business risks it contained.


Without stating the obvious, think of the impact of a valuation model that can quantify the value of current and projected product sales using client behavioural characteristics. Not only will you focus on what is important, total value, but you will also measure and quantify market behavioural variables to manage and control the efficiency of marketing:
• Focus on budgeted expected total value creation and the variance around that expectation, and develop appropriate strategies (campaigns, pricing…);
• Because the value model is based on quantified variables (financial, risk, client behaviour…), focus on the true value drivers and reduce business risks;
• Measure the value of marketing campaigns in a rational way, i.e. compare the impact of up-front marketing expenses with the present value of future sales, retention strategies, etc.;
• Develop pricing strategies adapted to financial and business risks;
• Drill down into all the variables that constitute client value and manage clients on a one-to-one basis;
• In combination with other marketing models (such as Event Based Marketing), substantially increase the efficiency and effectiveness of marketing.

Again this post is too long… sorry!

If you have comments, ideas and if you disagree please don’t hesitate to comment on the blog or send me an email at

The following chapters will describe in a little more detail LTV as I used it in the case study.


Filed under Customer valuation

Credit Process Flow Management 3

Process and Policy definitions

The core principle of the ARGOS solution is the optimisation of analytical and operational processes. Any implementation starts with a process flow definition as defined by the bank. We do not impose any process flow; we use the bank's own process! Nevertheless, we know that more often than not banks will decide to reengineer their processes when confronted with a clear description of them, which often highlights process flaws.
We will of course collaborate with the bank to define the processes it wants to manage. This consulting work is done before the implementation of the solution and can be limited to the description of the current processes, or extend to their reengineering. The process definitions will be documented in a standard process flow description application readily available in the market, so that the bank can keep the process descriptions in line with their evolution over time.

To illustrate the credit process, consider the following simplified credit process:

credit UW process

Step 1: Client contacts bank and request is registered
Step 2: Account manager conducts interview registering details in Excel template
Step 3: Excel data are submitted to remote server
Step 4: Remote server reports credit risk assessment and financial analysis
Step 5: Decision by the account manager
Step 6: Decision by the credit manager
Step 7: Sending out letter to client
Step 8: Recording in database
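The eight steps above form a strictly ordered flow. This minimal Python sketch (illustrative only, not the ARGOS implementation) shows how a process flow engine can enforce that the steps are completed in sequence and keep a simple audit trail of what has been done:

```python
from enum import IntEnum

class Step(IntEnum):
    """The eight steps of the simplified credit underwriting flow above."""
    REQUEST_REGISTERED = 1
    INTERVIEW = 2
    DATA_SUBMITTED = 3
    RISK_ASSESSMENT = 4
    ACCOUNT_MANAGER_DECISION = 5
    CREDIT_MANAGER_DECISION = 6
    CLIENT_LETTER = 7
    RECORDED = 8

class Application:
    """One credit application; steps must be completed in order."""
    def __init__(self, client):
        self.client = client
        self.completed = []  # audit trail of completed steps

    def complete(self, step):
        expected = Step(len(self.completed) + 1)
        if step != expected:
            raise ValueError(f"expected {expected.name}, got {step.name}")
        self.completed.append(step)

app = Application("SME-001")
for step in Step:
    app.complete(step)
assert app.completed[-1] == Step.RECORDED
```

A real engine would add per-step documents, timestamps and user identities, but the ordering constraint and audit trail are the essence of the control benefit described later in this post.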

All the documents referred to in the process flow are of course bank-specific templates. Unless requested to change or improve the current bank documents, ARGOS uses the existing templates. ARGOS makes use of dynamic forms technology, through which content filled out in any of the forms is stored automatically in the database. All of the data can be retrieved and posted in automatically generated documents such as proposal letters and reports. The interview will thus be based on the standard questions the bank requires to analyse the solvency of the prospective client.

The process can be completed by the New Business Officer, the Analyst and the Manager anywhere, anytime, as long as they are connected to the bank’s information system or the internet. The process implementation is simple and requires standard inputs in template electronic documents.

The procedures (and processes) will be compliant with the bank’s Credit Procedure Manual. If such a reference does not exist, our consultants can support its development.

management steps UW process

As with procedures, the bank will have Credit Policies that define all aspects of credit risk management. These policies include the rules applicable to each type of loan product and client profile, as well as the rules for credit approval authority, the credit analysis models and methodologies, the risk pricing rules and of course the reporting rules (internal management reporting as well as compliance reporting and other external communications).

If these policies need to be reviewed, adapted or enhanced for management or regulatory purposes, we can support that effort through a parallel consulting engagement. The system must also be flexible so that policy changes can be implemented easily. For example, the credit approval matrix (amounts by function, risk authority delegations and other rules) may change; such changes are integrated into the credit rules of ARGOS quickly and easily. The policies will of course define the acceptable credit analytical models and scoring/rating rules.

Business Drivers for an integrated credit underwriting process like ARGOS

The business advantages of using a platform such as Argos have been mentioned previously. They can be summarized as follows.

UW process management benefits

This risk management platform is not a credit risk capital adequacy management platform; it is a business management platform for credit risk, compliant with the 2nd pillar of Basel II. The scoring/rating analytical models are optimised for markets with limited current and historical data of the quality and depth required by the Basel Accord. It is recommended that these emerging banks use the standardised approach to calculate their capital requirements. But the reality of the matter is that the two approaches are reconcilable and complementary.


The credit underwriting process flow management system allows smaller banks to streamline and integrate all stages of credit decisioning.

Those interested in knowing more can email me ( to request a complete description of the approach and product, which includes descriptions of:
1. The scoring engines,
2. The rating approach,
3. The collateral module,
4. The reporting and document generation module,
5. The audit trail and control functions.

Also included are descriptions of the Business Architecture and the Technical Architecture.
Let me know if you wish to receive this. I will only respond to requests that include the name of the institution and the full name, function and contact details of the person requesting the document.

Thanks and talk to you soon.

Leave a comment

Filed under Credit Underwriting Process Management 3, Uncategorized

Credit Process Flow Management 2

The Benefits of a structured approach
The ARGOS Credit Process Management is a modular system that supports bank processes and analytics. It focuses initially on the lending services of banks to retail clients and SMEs, but also serves wholesale customers.
What are the business benefits of implementing a robust process flow management application?

Business benefits:

1. Enhanced competitiveness and service differentiation:

a) Improved responsiveness of the bank to credit applications through a reduction of the application response time. Linked to an appropriate credit scoring/rating system, the bank’s response can, for certain products and activities, be quasi-immediate. In other cases the improved process flow will reduce bottlenecks and the time between two processes.

b) Optimized use of a variety of distribution channels. This is particularly important for banks having large branch networks in large countries with limited access to remote areas.

c) One-to-one loan pricing to enhance client satisfaction, in strict compliance with credit policies (see below).

2. Enhanced credit management capabilities:

a) Implementation of a one-to-one risk based pricing on all products. By linking the credit approval analytics with pricing analytics in an integrated credit underwriting process the bank can individualize the credit pricing on a risk and relationship basis.

b) Strict compliance with the bank’s Credit Policies and Procedures. The process flow will be managed in accordance with the bank’s approved procedures and policies (approvals…). The integration of the credit scoring/rating applications allows systematic compliance with the credit analytical models specified in the bank’s policies.

c) The credit risk data generated by the systems (process and analytics) are focused on credit management and may differ from the capital adequacy models used by the bank, but they are reconcilable with the Basel II, Pillar 1 models.

d) The management of credit processes is an important step towards compliance with the Basel II, Pillar 2 Internal Capital Adequacy Assessment Process (ICAAP).

Operational benefits:

1. Enhanced operational efficiency: all the credit underwriting processes are defined in a process flow applicable to all applications and differentiated only when needed by market/client type.

2. The automation of the process flow results in improved control and reduced operational risk:

a) All processes can be tracked and controlled, allowing better management of the operating capacity (bottlenecks, exceptions, resource allocations – human and technical resources…).

b) An audit trail is generated for all transactions handled by the system.

3. The web-based client-server architecture allows the system to be implemented throughout the bank’s distribution network (branches, specialised departments, internet, mobile banking, third-party sales partners…), with all the required controls and audit trails.

The next post will describe the process management requirements. If this looks like a sales brochure… guess what! Sorry for that but we have to earn a living.

Leave a comment

Filed under Credit Underwriting process management 2: Benefits, workflow and process management

Workflow management for the credit underwriting activities of retail and wholesale banks

BC&T has partnered with Stachanov BV ( to bring to the market a process flow management solution tailored to the smaller banks. I will introduce this application (ARGOS) in this blog.

Interested institutions should contact me on


The banking industry is faced with fundamental challenges that will impact its activities, business model, processes and organisation. The need for these changes resulted from the great banking crisis that started in 2008. The consequences of the crisis were regulatory, but the fundamental equilibrium of the industry was also affected, most fundamentally through the breakdown of the “trustful” relationship between bank and client. This has resulted in increased investments in regulatory compliance and an increasing risk of the bank being disintermediated, and hence in an increasingly competitive environment.

The challenge for shareholders and senior management is to implement business strategies that respond to contradictory constraints:

  1. Comply with a growing set of regulations for all risk classes, including operational risk, solvency and business risks. Compliance is more extensive than capital adequacy as it includes minimum risk assessment procedures and processes, model risk etc. Management must seek to leverage these investments to create value!
  2. Optimise operational efficiency. In its simplest form this means generating more risk adjusted revenues with a given level of operating expenses. Without doubt this implies process improvements and enhanced operational policies.
  3. Optimise operational effectiveness. Not only does the bank need to be financially efficient, its process must also be effective. By this we mean that the operations of the bank must lead to the strengthening of the service and product delivery to enhance client satisfaction, recreate client trust and hence reduce the business risks.
  4. Optimise financial efficiency. The banks are faced with scarcity of financial resources (equity and borrowed capital). They must allocate these resources effectively to generate the required return on risk. This requires that all the previously mentioned challenges be integrated into a business model that is adapted to the bank’s market and clients and manages all the component of this model in an integrated way.

Many banks do not have the size and means to invest huge amounts of money to buy solutions developed for the larger banks in more advanced financial markets. ARGOS was developed to respond to the challenges by bringing simple but effective solutions to smaller banks to

  1. optimise their process
  2. optimise their risk underwriting and management.

We have started with the most important activity of many retail and wholesale banks, which is the credit underwriting process and the credit scoring and rating models. We have extended the development vision to include collateral valuation and, through the calculation of expected losses, to generate the management information for credit risk provisioning and risk-based pricing. The proposed solution is not invasive, as it uses the rules, policies and procedures of the bank. If the bank’s management wishes to review these rules, policies and procedures, to adapt them to a more effective and efficient business model and implement enhanced risk models, we can help develop these and of course implement the new rules in ARGOS.

With ARGOS, banks will have an automated credit underwriting process from origination/credit application to disbursement. The process flow will respect the bank’s defined credit policies through structured processes. This approach is Basel II, Pillar 2 ICAAP compliant and will furthermore enhance the risk-adjusted profitability and financial efficiency of the bank.

The solution is geared towards the development of an integrated risk management system for emerging banks. It can be applied in retail banks, for SME banking or for wholesale banking. The priority is to develop a pragmatic and efficient solution that can be implemented easily for a low “cost of ownership”.

In the next blog we will discuss some of the benefits of the system proposed.

Don’t hesitate to comment on the content of this new series of bank management discussions. If you want to be informed of posts as they are published, don’t forget to follow this blog or to request email notifications.

Leave a comment

Filed under Introduction to workflow management, workflow and process management

e-book on Risk and Relationship based Pricing Models and Strategy

This is a last posting on the pricing model and strategy.

I have transformed the pricing posts into an e-book titled Risk and Relationship based Pricing. The e-book is based on the posts of this blog, reorganising them and correcting many small and not-so-small mistakes, misspellings and typos!

You can get this e-book (100% free) by sending me an email request on

Please add your occupation, employer, and if you are interested in receiving offers for training or additional information on this subject. I may restrict the transfer of the e-book if the request is not legitimate.

Thank you for your continued support.


Leave a comment

Filed under Risk & Relationship Based Pricing

Risk & Relationship based Pricing Part 5

This continues and completes the description of good risk and relationship based pricing principles.
Principle 6: Define cost allocation strategy and models
As a general principle all customers should pay the costs associated with the product they buy. The questions are:
1. What costs should be allocated?
2. With what level of granularity should these costs be allocated?
3. How do you estimate these costs?

In an ideal world, cost allocation should be done at the most granular level possible. This means, as an example, that ATM costs should be allocated to each customer on the basis of that customer’s ATM usage, and every ATM may have a different cost structure depending on location and other variables. This requires a very big and complex cost allocation system that can individualise all costs (and revenues) at client/contract level. Teradata, for example, has such an application, but be warned: it’s a big project to implement!
But is this good enough? Not if you just allocate past costs, as we are looking at future cash flows of revenues and costs! Those future costs will vary in time due to inflation, capacity, operational efficiency etc. This means you have to estimate future volatile costs based on historical allocations. Not impossible but definitely not easy.
What could be the consequences of allocating future costs as they are expected to occur, while revenues depend on the contract characteristics? Take a mortgage with fixed monthly instalments. The revenues of such a contract will be very high during the first years, then quickly drop and become small in the following years, while the operating expenses for the contract remain stable or even increase through inflation. The net profitability schedule will show high early profits followed by many years of losses! This is the simple consequence of accrual accounting and should be adapted in management accounting.
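The accrual effect just described is easy to reproduce numerically. The sketch below uses a deliberately simplified annuity model with illustrative numbers (not a bank costing system): the interest margin shrinks with the outstanding balance while the inflating servicing cost does not, so early profits turn into later losses.

```python
def mortgage_profit_schedule(principal, client_rate, funding_rate,
                             years, annual_opex, inflation):
    """Yearly net profit of a fixed-instalment (annuity) mortgage:
    the interest margin shrinks as the loan amortises, while the
    (inflating) cost of servicing the contract does not."""
    instalment = principal * client_rate / (1 - (1 + client_rate) ** -years)
    outstanding = principal
    profits = []
    for year in range(years):
        margin = (client_rate - funding_rate) * outstanding  # accrual revenue
        opex = annual_opex * (1 + inflation) ** year         # servicing cost
        profits.append(margin - opex)
        outstanding -= instalment - client_rate * outstanding
    return profits

profits = mortgage_profit_schedule(
    principal=200_000, client_rate=0.05, funding_rate=0.035,
    years=20, annual_opex=450, inflation=0.02)
# high early profits that decline into losses in the later years
```

With these inputs the first year earns a comfortable margin and the last years are loss-making, exactly the profitability schedule the paragraph warns about.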
Some banks want to allocate all costs, down to the taxes associated with the product. That is actually easier to do, but is it correct? Should the customer pay the bank’s corporate income tax, or for the Chairman and CEO’s airplane? Should the customer pay the cost of the equity allocated to the products he buys?
In my opinion the answer is definitely no. The principle is that customers must pay the expected losses and the expected operational expenses (direct and indirect), but that the shareholders should pay for unexpected risks and for the indirect costs generated by the nature of the corporate activity (being a bank), including regulatory expenses, general management, audit and control costs… Yes, these should be paid by the shareholders through reduced net profits (RoE) and dividends. An all-in cost invoiced to the customers would have them pay for the inefficiencies of management and would make banking products unaffordable.
Also core is the cost quantification model used. What cost accounting models should be applied? Average costing by product, by process, or activity based costing…?
Decisions on the cost allocation strategy will fundamentally change product pricing and profitability. What is appropriate for your bank, your market, your global strategy? How is your bank analysing this? The Life Time Value of the contract, or if you prefer the fair value of the contract after all cost allocations, appears to me as the only financially correct way of giving a value to that contract/client.

Principle 7: Define, quantify and manage current and future client profit and value contribution
It is evident that bank profitability and shareholder value require appropriate pricing. But excess pricing can destroy value by killing growth and client satisfaction. To avoid that, pricing must be “efficient”, i.e. fixed on the basis of risks and of market/client specific demand and price elasticity.
We believe this means that client pricing, and consequently the client’s profitability analysis, must be granular and forward looking. The average profitability of a product, multiplied by the contract size and added to the profitability of the other holdings calculated on the same average basis, will only give you an average historical profitability of little use in managing your client in the future.
Are your profitability analytics compatible with modern value metrics, such as the Customer Fair Value, which can be summarised as the fair value of the client’s current product holdings (multiplied by client behavioural variables) plus the fair value of future product holdings (multiplied by the related client behavioural variables)?
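The Customer Fair Value idea can be illustrated with a toy calculation. All numbers and the simple weighting scheme below are hypothetical; the point is only the structure: present value of the current holding weighted by retention, plus present value of future holdings weighted by purchase propensity.

```python
def present_value(cashflows, discount_rate):
    """PV of a list of yearly cash flows (years 1, 2, ...)."""
    return sum(cf / (1 + discount_rate) ** t
               for t, cf in enumerate(cashflows, start=1))

def customer_fair_value(current_margins, retention_prob,
                        future_margins, purchase_prob, discount_rate):
    """Illustrative structure only: fair value of the current product
    holding, survival-weighted by retention, plus fair value of
    expected future holdings weighted by the propensity to buy."""
    current = [m * retention_prob ** t
               for t, m in enumerate(current_margins, start=1)]
    future = [m * purchase_prob * retention_prob ** t
              for t, m in enumerate(future_margins, start=1)]
    return (present_value(current, discount_rate)
            + present_value(future, discount_rate))

cfv = customer_fair_value(
    current_margins=[800, 750, 700], retention_prob=0.9,
    future_margins=[0, 400, 400], purchase_prob=0.3,
    discount_rate=0.08)
```

The behavioural variables (retention, purchase propensity) are exactly where the scoring models of Principle 12 plug in: improve retention and the fair value rises immediately.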

risk adjusted current and future client value

Such a model will highlight and individualise the behavioural sales/marketing risks and allow the management of business uncertainties. LTV is a metric that integrates and manages all client value drivers: profitability, growth, risks and time.
We will discuss customer value in detail in a future series on this blog.

Principle 8: Manage competitive pricing
How should the bank integrate pricing information from the market, from competition?
What should the bank do when it models a theoretical risk and relationship based price (for a product sold to a specific prospect) which differs from the market prices?

  • Reduce its price to meet competition?
  • Increase its price when it is lower than the market price?
  • Keep price unchanged?

A simple question but a very complex answer, requiring a multi-dimensional analysis!

  •  How to handle market inefficiencies and bank specific inefficiencies?
  • How do you integrate product life cycle and relationship life cycle?

These multidimensional analyses should allow you to understand pricing inefficiencies and market/pricing opportunities. But don’t make the client pay for your inefficiencies, and always manage pricing in a transparent and open way with the prospective client.
Remember: abnormally high spreads can lead to immediate profits but negative value because of increased attrition. You need a long term value measure to manage your client over a long term horizon, at least the life cycle of the product holdings!
Strategically you may want to sell a product at breakeven or at a loss because you are buying market share or you have bundled that product with high profit/value products. In all such cases you must calculate the loss and/or negative value generated and allocate it as a cost to sales/marketing.

Principle 9: Manage the complexity and risks of your quant models.
Bank management is complex because cash flows are volatile and behavioural. Managing that complexity requires deep understanding of the market, the client, and the product value drivers.
Each bank must adapt the complexities of its business models to its needs and markets. It should use quant models that are useful for its specific strategies and market constraints (often restricted due to weak data quality and availability).
From a practical perspective, the bank’s sales force does not need to understand all these complexities. You must package products in a way that can optimise the sales unit efficiencies.
Likewise, clients do not need to understand all the complexities of the product. Managing risk and operational intermediation is what banks should do. They must package the products to make them attractive, but without hiding any financial risks that could be transferred to the product buyer. If you sell a variable rate mortgage, inform the customer that there is risk in such a product for him (the bank has passed the financial risk to the customer because it will not or cannot manage the interest rate…). Finally, be transparent in pricing; clients will understand and appreciate this honest and professional attitude.
Ensure strong communication between your sales force and the clients, based on a simple list of core variables such as:

  • Price is dependent on risk and collateral,
  • Standard, off-the-shelf transactions are cheaper than tailor-made products,
  • Pricing is a function of the operational process; automated processes (internet banking) are cheaper than 1:1 branch operational support.

Principle 10: Adapt the organisation to the requirement of risk and relationship based pricing
A close integration of all risk management processes and policies with the other parts of the process flows will generate operational and management synergies. In other words, banks should consolidate management analytics and resource allocations and break operational silos. This has often been said but not often realised! The evolution of data management and management analytics can achieve what enterprise culture has often shied away from!
Consolidation of management analytics and resource allocations will include Enterprise-wide Risk Management (ERM) capabilities, often with new and specialised functional responsibilities (Risk Transfer Pricing, Economic Capital budgeting and allocation, risk policy recommendations and strategy implementation…). On that subject, please don’t look at ERM as a purely organisational issue (bringing all risk functions under one umbrella headed by the CRO). ERM is much more than that!
Breaking operational silos means ensuring that the process flows are managed from client product request to product delivery in the most efficient and effective way, focused on client satisfaction and the bank’s performance criteria (KPIs), including operational capacity usage and cost efficiency, Economic Equity optimisation etc.
I recommend that such an integration project include a thorough “impact analysis” of all aspects of the business model: Process, Organisation, Data, Applications and Technology, or PODAT analysis.

Principle 11: Integrate pricing process within an efficient process flow management system, from the origination of the client request to its fulfilment
Pricing is just one element in a chain of processes from origination to product delivery.
In the case of a loan these will include:

  1. application analysis and qualification,
  2. product matching to client requirements,
  3. credit analysis – scoring & rating,
  4. credit structuring & pricing,
  5. credit approval,
  6. documentation development and signature,
  7. verification of condition precedent,
  8. disbursement(s),
  9. contract management (interest payments, interest rate fixings…),
  10. collection.

Any delay or mismanagement of any of these activities (themselves usually divided into multiple operational, analytical and management processes) will have negative impacts on the bank’s value drivers!
Control and automation of these activities and process are essential to sustain efficiency and strong business development.
The activities and processes are data driven, rule driven and model driven. Without clear Policies and Procedures the bank cannot achieve operational efficiency or meet international standards of good management. The bank must manage the related operational risks and maintain strong audit trails on all activities.
Regulatory constraints and capital adequacy rules depend on these activities and must be fully integrated.

Principle 12: Integrate risk underwriting and pricing in an integrated sales and marketing strategy
Efficient process, good pricing models and strategies can result in very bad client management if behaviour analytics are weak and below standard.
We will define the behavioural analytical models as:

  • Credit scoring models that allow appropriate credit risk rating (for risk management purposes, and not necessarily for regulatory purposes);
  • Collection scoring for appropriate collections management;
  • Marketing scoring models that allow the development of appropriate client information (intelligence) such as sales propensities, attrition risks etc.
  • The client score card for credit risk and product ownership (business) risks is a core input in financial risk management (interest rate, liquidity) and capital management.

Risk appetite and risk allocation will be RAROC (or RORAC) dependent, and the global strategy must manage the Earnings at Risk (EaR) due to all the risks underwritten by the bank. An Adjusted Return on Risk Adjusted Capital (ARORAC) is an interesting way of integrating the bank’s solvency into the equation. We will define ARORAC as the RORAC minus the Risk Free Rate (RFR), divided by the bank’s market beta.
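The ARORAC definition above translates directly into a formula. A small sketch with illustrative inputs:

```python
def rorac(risk_adjusted_return, risk_adjusted_capital):
    """Return on Risk Adjusted Capital."""
    return risk_adjusted_return / risk_adjusted_capital

def arorac(rorac_value, risk_free_rate, market_beta):
    """Adjusted RORAC as defined in the text:
    (RORAC - risk free rate) / the bank's market beta."""
    return (rorac_value - risk_free_rate) / market_beta

r = rorac(risk_adjusted_return=18.0, risk_adjusted_capital=120.0)  # 0.15
a = arorac(r, risk_free_rate=0.03, market_beta=1.2)                # ≈ 0.10
```

Dividing the excess return by beta penalises banks whose equity is riskier than the market, which is how the solvency dimension enters the metric.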

Principle 13: Optimise risk appetite, risk budgeting and allocation, and pricing towards clear and precise performance criteria
The 2008–2012 great banking crisis has imposed stricter internal risk management constraints, not only on capital adequacy but also, and possibly more importantly, on the banks’ risk policies, strategies, models and processes of risk management.
Consequently, this creates new challenges for the Financial Control function of the bank and for its management accounting policies and procedures.
In parallel, financial accounting (IFRS) is being implemented and includes approaches that differ from the risk management concepts defined under the Basel Accord and best risk management practices (BIS).
All the approaches are valid, but different hence they need to be reconciled!
Shareholder Capital is at a premium. It is rare, expensive and insufficient to cover all the needs of the market. Strict Economic and Regulatory Equity management is an absolute requirement. This translates into new solvency management requirements.
Banks are developing new Key Performance Indicators (KPIs) to meet these challenges. These must be integrated into pricing to generate value and manage the value drivers.
This can only be achieved if pricing is relationship based and risk based.

This concludes the series on risk and relationship based pricing. I realise that this helicopter view of a “best of class” approach can seem excessively complex for many banks and markets. But high competition, highly volatile markets, the increasing complexity of products sold and the increasing weight of regulatory constraints will force all banks to enhance their pricing models and strategies.
I hope my view of the subject will generate comments and discussions.
Please don’t hesitate to add yours on the blog.
Next series will cover customer value metrics.
I will then start a new series regarding a new approach in credit risk management using available but rarely used data. I’m talking of Event Based Credit Management, which I am developing with a good friend and partner, Mark Holtom. Mark is the founder and general manager of eventricity, an expert in Event Driven Marketing. I strongly recommend you check him out on

Leave a comment

Filed under Pricing Part 5

Risk & Relationship based Pricing Part 4

I believe we have shown in the previous blog postings that pricing can be complex, as it integrates many management aspects. That does not mean the pricing strategy and model used by a bank need to be complex and based on expensive pricing applications. The pricing model must be adapted to the bank’s business strategy and environment; it should also be based on core principles of good pricing.

In our previous blog we mentioned other concepts that we will review later, such as customer profitability and Life Time Value. We will do so in a separate series that should start early May. Stay tuned and get an email advice by following the blog.

Before we start the review of good pricing principles, I need to address a number of requests received from readers of our last post: can I send them a copy of the models and formulas used in the pricing exercise in post #4?

It is difficult to give a complete answer to the request because there are many sub-models in the pricing model. The inputs also depend on many additional models used by different units of a bank. Let me give you a few examples to illustrate this.

  1. Cost of Funds. The CoF is based on matched Modified Duration & Convexity or VaR, depending on the market and capabilities of the bank. That also includes all implicit optionalities such as delayed drawdowns, pre-payments… In some instances this includes specific market calculated liquidity premiums; in others best estimations of those premiums using a more deterministic model.
  2. Credit risk is based on PD, EAD, LGD. Each of these is dependent on multiple different models, including different scoring and valuation models…
  3. Customer Value can be a simple estimation of profitability or a more complex Life Time Value (LTV) calculation.
  4. Economic Equity. There are many different approaches of calculating Economic Equity, which as you know goes beyond Basel II and/ or Basel III. They also have different quantification models…
  5. Risk diversification/concentration uses a classic portfolio theory approach. The difficulty is in defining the future/projected variances and covariances. To estimate these you can use many different models (a Bayesian approach or simply historical statistical models…)
  6. Cost allocation. Do you use ABC, PBC… or other models and which costs do you allocate (fixed and variable costs? direct and indirect costs? Cost of Equity?…)
  7. Etc.
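To make item 2 concrete, the expected-loss building block combines the three parameters multiplicatively; each parameter is itself the output of its own sub-model (scoring, exposure and recovery models). The figures below are illustrative only:

```python
def expected_loss(pd, ead, lgd):
    """Expected loss = probability of default x exposure at default
    x loss given default (each the output of its own sub-model)."""
    return pd * ead * lgd

# Illustrative inputs: 2% default probability, 100,000 exposure,
# 45% loss given default.
el = expected_loss(pd=0.02, ead=100_000, lgd=0.45)  # ≈ 900
```

This expected loss is the component that, under Principle 1 below, is passed on to the client in the price, while the unexpected part is covered by equity.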

The appropriate pricing model is bank specific, as it depends on the strategy, activities, markets and corporate vision. It can be based on very complex or very simple models, but always needs a full understanding of the variables or value drivers. The bank must have control of the calculation models (no black boxes). The calculations I used in this blog are highly simplified models, just good enough to illustrate the issues I want to highlight.

We recommend you start with a “pricing policy and process assessment” of your bank’s current state, develop a future state vision (what you want to achieve in line with the bank’s strategy and banking model) and develop a project to bridge the gap between the current state and where you want to go.

My company can help you in such a project. Send me a message at for any additional information required.

Strategic principles of good pricing:

At this stage I count 13 core principles for good pricing. I’m sure we can add a few but these would be specific to an activity, market, bank, and would probably relate to different sales strategies. I want to focus on the generic principles relating to the pricing of financial products.

Principle 1:   Price integrates the volatility (risks) of projected future Cash Flows

In my first post on pricing, I indicated the difference between financial products and services and other manufactured goods. The main difference is that profit margins are realised during the whole life of the product, as opposed to profits realised on the day a manufactured good is sold. This difference is essential because profit margins depend on future volatile cash flows. The volatility is generated by all the risks associated with each product sold.

Profitability cash flows of Manufactured Goods versus Financial Products

What are the risks that need to be defined, quantified, managed and expensed to the client in pricing? (1) Credit Risk, (2) Interest Rate Risk, (3) Liquidity Risk, (4) Foreign Exchange Risk, (5) Business Risks, (6) Operational Risks, (7) Regulatory (Compliance) Risk.

Principles of good risk management are based on the separation between expected and unexpected risks. They require differentiated treatments in pricing!

  1. Expected Risks are managed through (1) hedging by collateral or other risk mitigation techniques and (2) provisioning strategies. In both cases the “cost” of managing the risk is passed on to the client through adequate pricing.
  2. Unexpected Risks are managed through equity. The capital adequacy regulations (Basel II & III) define the amount shareholders are required to invest to cover the activities of the bank. The cost of managing those risks is borne by the shareholders.

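The split above can be sketched numerically. In this minimal illustration (all figures are invented, not recommendations), the expected loss is priced directly into the client spread, while the unexpected loss is covered by equity whose cost is recovered as a capital charge in the margin:

```python
# Hypothetical figures for illustration only -- a minimal sketch of how
# expected and unexpected risk costs enter a loan price differently.

def expected_loss_rate(pd, lgd):
    """Expected credit loss per unit of exposure (priced to the client)."""
    return pd * lgd

def capital_charge_rate(capital_ratio, hurdle_rate, funding_rate):
    """Cost of the equity held against unexpected loss (borne by shareholders,
    recovered through the margin as a cost of capital)."""
    return capital_ratio * (hurdle_rate - funding_rate)

pd = 0.02             # 1-year probability of default (assumed)
lgd = 0.45            # loss given default (assumed)
capital_ratio = 0.08  # equity held per unit of exposure (assumed)
hurdle = 0.12         # shareholders' required return (assumed)
funding = 0.03        # cost of funds (assumed)

el = expected_loss_rate(pd, lgd)
cc = capital_charge_rate(capital_ratio, hurdle, funding)
print(f"Expected-loss spread: {el:.4f}, capital charge: {cc:.4f}")
```

The point of the sketch is the structural difference: the first term is a provision-type cost passed to the client, the second is a return owed to shareholders for carrying the tail risk.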
The quantification and management of all these risks is complex and requires fundamental adjustments in the bank’s processes, models, competencies and strategies.

Risks are also multidimensional and must be viewed both on a contract basis (variance management) and on a portfolio basis (covariance optimisation). Covariance management is done within each risk portfolio and between different risk portfolios. There are different ways of integrating covariance costs/benefits in pricing.

These questions need to be discussed at bank level as different approaches are possible. The approach will be described in the bank’s Risk Policies and Procedures and in the bank’s Pricing Policy and Procedure.

Principle 2:  Individualise all risks, price and manage to optimise their variance/ covariance on a 1:1 basis

Banks are moving from standardised products (amount, maturity, payment frequencies…) towards 1:1 marketing and product adaptation to specific client requirements (if not tailor-making). The characteristics of each financial contract are then specific and require 1:1 pricing.

Client behaviour analytics show major differences between clients and even between client segments. Future expected behaviour can be defined and analysed; it is a crucial element of the product’s future cash flows and of the value of the client relationship. To achieve this, data granularity is important. Technology allows such approaches and must be leveraged.

Risk concentrations and diversifications will be managed at product, client, market and bank portfolio levels. Pricing and management of all risk classes imply the integration of risk covariances (interest rate and credit risk; credit risk and liquidity risk…).

To allow the implementation of such a risk adjusted model, the bank must develop a complete set of contract level Risk Transfer Prices (RTPs) with the appropriate valuation models. This includes Fund Transfer Pricing, Credit Transfer Pricing, Liquidity Transfer Pricing… and Expense Allocation/ Transfer Pricing.
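A contract-level price built from such Risk Transfer Prices can be sketched as a simple additive build-up. The components and spreads below are hypothetical placeholders, not a prescribed decomposition:

```python
# A minimal sketch of a contract-level rate build-up from risk transfer
# prices (FTP, liquidity, credit, expense allocation). All spreads are
# hypothetical figures for illustration.

def client_rate(ftp, liquidity_spread, credit_spread, expense_rate, target_margin):
    """Sum the transfer prices and the target margin into a client rate."""
    return ftp + liquidity_spread + credit_spread + expense_rate + target_margin

rate = client_rate(
    ftp=0.0250,               # matched-maturity funds transfer price (assumed)
    liquidity_spread=0.0040,  # liquidity transfer price (assumed)
    credit_spread=0.0090,     # expected-loss credit transfer price (assumed)
    expense_rate=0.0060,      # allocated operating expenses (assumed)
    target_margin=0.0080,     # shareholder margin / cost of capital (assumed)
)
print(f"Client rate: {rate:.2%}")
```

The value of the decomposition is managerial: each component can be owned, measured and challenged by a different unit (treasury, credit, operations), while the sales unit only negotiates the final margin.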

The development of such an integrated risk-based model is transversal and vertical across the whole organisation. It requires a full impact analysis and feasibility analysis. There are recommended methodologies to define the impact of such a business model, starting with a clear definition of the Business Principles to be implemented, the Assumptions underlying the project and the Constraints applicable to the strategy. This is not a trivial project.

Principle 3: Understand client price elasticity on a multidimensional factor basis.

Most marketing/ sales/ distribution departments measure performance on the basis of volume (new transactions, campaign hit rates and transformation rates…), with very little attention to 1:1 pricing. Mispricing will negatively impact product sales propensities, in ways that go beyond a simple sales volume metric.

For example, the bank can integrate price elasticity with client solvency. If not, there is a high probability of selling the wrong product to the wrong client at the wrong price. Low-risk (good-rating) prospects will have near-zero purchase propensities for high-priced credit products, while high-risk-rating prospects will have high purchase propensities for loan products whatever the price. Pricing on the basis of average risk ratings and rating cut-offs rather than individualised risk-based pricing will lead to a risk degradation of the credit portfolio and of its risk-adjusted return!

sales propensities per risk rating and risk price
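This adverse-selection mechanism can be illustrated with a toy propensity table. The propensity numbers below are entirely invented; the point is only the structural effect of a single average price versus risk-based prices:

```python
# A hypothetical illustration of adverse selection under a single average
# price: purchase propensity by rating band, with invented numbers.

# propensity[rating][pricing]: share of prospects who accept the offer
propensity = {
    "low_risk":  {"risk_based": 0.60, "average": 0.15},  # overpriced -> walk away
    "high_risk": {"risk_based": 0.20, "average": 0.70},  # underpriced -> pile in
}

def booked_mix(pricing):
    """Share of high-risk clients in new bookings (equal prospect pools assumed)."""
    low = propensity["low_risk"][pricing]
    high = propensity["high_risk"][pricing]
    return high / (low + high)

print(f"High-risk share, risk-based pricing: {booked_mix('risk_based'):.0%}")
print(f"High-risk share, average pricing:    {booked_mix('average'):.0%}")
```

Under average pricing the booked portfolio skews heavily towards the high-risk band, which is precisely the degradation of the credit portfolio described above.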

Too many retail banks still use standard product prices that they only adapt to the customer through a “negotiation/ commercial margin” assigned to the sales unit. This is NOT individualised or 1:1 pricing; it just allows weaker sales units to sell on price, and strong clients to force prices down.

Bad pricing is a source of attrition among your best clients, those that create sustainable profits (lifetime value, LTV). This is an important business risk, resulting in a steady erosion of the bank’s value (goodwill).

Principle 4:  Define, quantify and manage business risk

What is business risk?

From the bank’s perspective, business risk covers a very wide set of issues, but for our discussion we will limit it to the risk of not achieving the expected (budgeted) performance in client relationship growth and profitability. This may be due to a number of internal and external factors. Again we will restrict this to business risk from a sales/ marketing perspective, in which case we can define it as the risk that clients and prospects do not meet the behavioural targets planned by the bank: sales propensities were lower (or higher) than anticipated, cross-sales and up-sales did not meet plans, attrition rates were higher (or lower) than expected… Consequently the bank does not achieve its planned growth and profitability, its return on marketing costs.

business risk management by targeting improved performance and reduced volatility of the targeted performance

One way of looking at it (and modelling it) is to estimate an expected result (for example the client’s LTV) and the dispersion around that expectation, measured as the standard deviation (STD). Managing the business risk is then the science/ art of increasing the expected LTV from A to B, and of reducing the STD of the return from STD1 to STD2, by implementing the appropriate strategies in:

  • Product development
  • Distribution strategies
  • Marketing campaigns
  • Pricing strategies
  • Etc…

Note that this is completely different from classic satisfaction surveys or other methodologies often used to estimate client behavioural variables. Client satisfaction surveys do not allow the bank to measure and manage its growth and profitability targets, as they do not quantify the factors generating growth and profitability. What you need are behavioural models that can discover those factors and explain the causalities of client actions and expected future behaviours.
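The expected-LTV / STD framing above can be sketched with a small Monte Carlo experiment. All figures are invented, and the normal distribution is only an illustrative assumption; the point is that shifting the mean up (A to B) while tightening the dispersion (STD1 to STD2) sharply reduces the chance of a disappointing outcome:

```python
import random

# A minimal sketch of business-risk management as a mean shift (A -> B)
# plus a dispersion reduction (STD1 -> STD2). Figures are hypothetical
# and LTV is assumed normally distributed purely for illustration.

def shortfall_probability(mean, std, floor, n=100_000, seed=42):
    """Monte Carlo estimate of the chance a client's LTV falls below a floor."""
    rng = random.Random(seed)
    below = sum(1 for _ in range(n) if rng.gauss(mean, std) < floor)
    return below / n

before = shortfall_probability(mean=1000, std=400, floor=500)  # A, STD1
after = shortfall_probability(mean=1300, std=250, floor=500)   # B, STD2
print(f"P(LTV < 500) before: {before:.1%}, after: {after:.1%}")
```

In practice the inputs to such a model come from the behavioural analytics discussed above, not from satisfaction scores.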

Principle 5:   Adapt the pricing strategy to the bank’s market strategy and client value proposition

Before it can define its pricing strategy, the bank must decide what its market vision and strategy are, and what value proposition it wants to propose to the market.

You may want to use the approach developed by Michael Treacy and Fred Wiersema (The Discipline of Market Leaders) or any other framework that focuses on what and why clients buy from one bank rather than another.

Treacy and Wiersema suggest that customers seek three core values from their suppliers of goods and services: “customer intimacy”, “operational excellence” or “product innovation”. They also state that companies that are clear leaders in one of these values and hold a strong at-par position in the other two will be the market leaders and enjoy above-par performance. Finally, their analysis shows that the business models (process, organisation) and the technical requirements (data, applications, technology) are different for the three strategies, hence no one can hope to be the best in all three values sought by clients. Note also that customer intimacy is not equivalent to client centricity! Client centricity is the focus on delivering a product/ service that maximises the client’s satisfaction, which can be achieved through either operational excellence or customer intimacy.

With regard to retail banking, two of these approaches are clear options: Customer Intimacy (developing a 1:1 relationship based on analytics and proprietary information) and Operational Excellence (mass marketing of standard products delivered at the best price and without any operational glitches). Product Innovation is more difficult to apply to banking because of a number of factors which I will not expand on here.

Customer Intimacy versus Operational Excellence bank models

Imagine two banks with two different value approaches, where Bank A wants to focus on customer intimacy, while Bank B is targeting operational excellence. They will need to build two very different types of banks, with different organisations, processes and skills based on specific technology and analytics. They will also develop fundamentally different pricing strategies.

I will soon continue the review of the remaining pricing principles. Meanwhile, send me your comments and subscribe by email to be notified of the following posts!

Filed under Pricing Part 4