Finding end-user models
Harder than hunting for needles in a haystack
The SAS Institute platform offers a host of data aggregation, data cleansing and analytical capabilities to your analyst community. Because of this breadth of capability, you may have analysts or teams creating what regulatory groups would consider models subject to model governance and compliance review. Furthermore, many SAS workloads qualify as End User Computing workloads under the CECL, CCAR and IFRS 9 regulatory guidelines, and these need to be inventoried, reviewed and placed under a governance scope.
Do you know who these teams and analysts are? Can you prove that your model governance process has identified all these models? Chances are, many of these models and End User Computing instances have gone unidentified, in many cases because they’re hiding in plain sight. SAS workloads can be created, executed and results generated through multiple means; there isn’t just one way to execute a SAS workload.
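A first step toward answering those questions is simply finding where SAS work products live. As a minimal sketch only (the file extensions and the folder-per-team ownership convention are assumptions that will vary by site, and this is not how any particular governance product works), a filesystem scan for SAS artifacts might look like:

```python
from pathlib import Path

# Extensions that typically indicate SAS work products (an assumption):
# .sas = program source, .egp = Enterprise Guide project, .sas7bdat = SAS dataset.
SAS_EXTENSIONS = {".sas", ".egp", ".sas7bdat"}

def find_sas_artifacts(root):
    """Walk a directory tree and return SAS artifacts grouped by owner folder.

    The owner is assumed to be the top-level folder under `root`
    (e.g. a shared drive organized by team) -- a site-specific convention.
    """
    root = Path(root)
    inventory = {}
    for path in root.rglob("*"):
        if path.is_file() and path.suffix.lower() in SAS_EXTENSIONS:
            owner = path.relative_to(root).parts[0]
            inventory.setdefault(owner, []).append(path.name)
    return inventory
```

An inventory like this only surfaces candidates; deciding which of them are models subject to governance still requires review of the code and its business use.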
Tale of Three Cities
Modernizing traditional SAS business applications to Viya on AWS
Clients ask us this question frequently: “…Should we stick with our business applications on SAS 9.4, or should we take the plunge into SAS Viya? I’ve heard there are some functional differences between the two, and some of my colleagues are concerned that Viya implementations still feel new and a little risky. As professional designers and implementers, what is Corios’ assessment of the risks and challenges (and benefits)?”
Everyone’s mileage will vary, but here is our experience migrating three clients from traditional SAS to Viya on AWS. Our clients include a credit card-issuing bank, a commercial bank, and a business financial data clearinghouse.
When the world hands you lemons, make lemonade
Adapting to market conditions with innovation
We can’t control the market around us, especially in the economic downturn produced by the pandemic, but we can choose how to respond to it.
A low-interest-rate market environment, like the one the US experienced from 2008 to 2015 following the credit crunch and again from April 2020 onward, makes it very challenging for banks to earn interest margins through lending. One strategy that some of our clients have followed is to encourage their commercial banking clients to make the most of treasury services, which generate fee income and aren’t dependent on interest rates. These treasury services are varied transaction and information services that help businesses manage their cash deposits and inter-party transactions. The image on the right illustrates the federal funds rate history for the period 1954-2021, with emphasis on the most recent years. When the federal funds rate is low, the interest rate that banks can charge on loans to their customers is correspondingly low.
Go Back, Jack, Do It Again…
Building a model pipeline twice, in both conventional and open source contexts
(Credits to Steely Dan for the title)
Corios was hired by a rapidly growing bank to build the newest release of their prospect acquisition scorecard model; not once, but twice: once in SAS 9.4 (their production environment), and the second time in a hybrid SAS Viya / open source approach that leveraged Python, Dask and Spark. The reason for the second modeling effort was to explore what an innovative, modern cloud-focused analytic environment could and should look like to support predictive model lifecycle management: authoring, champion/challenger experimentation, validation, version management, cloud deployment, drift analysis and ongoing refresh.
Building the first, traditional model pipeline was familiar territory for us, because we had built several models for this client that had been put into production over the past few years. The greatest challenge was that the bar was set very high: we had to beat the mathematical performance of the current model version, which was constructed effectively and performed strongly.
The second model broke a lot of new ground for the client. Major elements included: Amazon Web Services clustered compute, storage, code development and management; Python, Dask and Spark as open source frameworks for analytic pipeline development; side-by-side comparisons for analytics assets built in familiar territory (SAS) and unfamiliar territory (open source frameworks on cloud services); and novel analytics techniques (and their potential performance contributions) that the open source frameworks made available to the bank for the first time in a native, business-critical context.
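One of the lifecycle stages named above, drift analysis, is often implemented for scorecards with the population stability index (PSI). As an illustrative sketch only (the rule-of-thumb thresholds are industry folklore, not a claim about this client’s methodology), a minimal PSI calculation looks like:

```python
import math

def psi(expected_pcts, actual_pcts):
    """Population Stability Index between two binned score distributions.

    Inputs are per-bin proportions (each summing to 1) for the development
    ("expected") and current ("actual") populations. A common rule of thumb:
    PSI < 0.10 is stable, 0.10-0.25 is a moderate shift, > 0.25 is
    significant drift warranting model review or refresh.
    """
    return sum(
        (a - e) * math.log(a / e)
        for e, a in zip(expected_pcts, actual_pcts)
        if e > 0 and a > 0  # skip empty bins to avoid log(0)
    )
```

In a scheduled pipeline, a computation like this would run against each scoring batch, with alerts raised when the index crosses the review threshold.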
Picking Up the Jellyfish
Modernizing analytics practices for 800+ insurance analysts
Corios was hired by a prominent insurance carrier to modernize their analytics and data practices for all things analytical: underwriting, pricing, claims, repairs, coverage, compliance, and regulatory support. They wanted to reduce the cost of data storage, to align all their analysts on a consolidated set of tools and environments, and to modernize the enterprise so they could react to climate events and other large-scale adverse events faster and more efficiently.
The Corios solution we use in these engagements is Corios Rosetta, which combines Corios software and a service methodology to inventory, score, prioritize and modernize our clients’ SAS data and analytics assets. After inventorying their workloads, data and teams, and interviewing leadership and subject matter experts, we recommended centralizing the workloads that relied on their primary atomic-level data warehouse (in Oracle), and moving their non-warehouse workloads and analysts to Python on Domino Data Labs for virtual analytic environment provisioning and archiving. We then spent the next six months modernizing the work of their 800+ analysts along this roadmap.
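Rosetta’s actual scoring is proprietary, but the general shape of an inventory scoring and prioritization step can be sketched. In this hypothetical example, the attribute names, weights, and the weighted-sum approach are all illustrative assumptions, not Rosetta’s method:

```python
def prioritize_workloads(workloads, weights=None):
    """Rank inventoried workloads for modernization by a weighted score.

    Each workload is a dict of attributes normalized to 0-1.
    The attribute names and weights below are illustrative only.
    """
    weights = weights or {
        "run_frequency": 0.4,        # how often the workload executes
        "data_volume": 0.3,          # relative size of data touched
        "warehouse_dependency": 0.3, # reliance on the central warehouse
    }

    def score(workload):
        return sum(weights[k] * workload.get(k, 0.0) for k in weights)

    return sorted(workloads, key=score, reverse=True)
```

Ranking the inventory this way lets a modernization roadmap tackle the highest-impact workloads first rather than proceeding team by team.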