P-LEI.org plateau

P-LEI.org was launched in July 2013 to provide a consolidated view of reference data published by pre-LOUs in the nascent Global Legal Entity Identifier System (GLEIS), by mapping each pre-LOU’s published files to a common format defined by P-LEI.org. P-LEI.org was, and is, a pro bono service provided free of charge, for the public good, by its collaborating sponsors: GS1, Tahoe Blue Ltd, FIX Protocol Ltd, and the Corporation for National Research Initiatives.
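In practical terms the consolidation was a field-mapping exercise: read each pre-LOU’s published file, then rename and rearrange its fields into the common layout. Below is a minimal sketch of that kind of mapping in Python; the source column names and common field names are hypothetical stand-ins for illustration, not the actual P-LEI.org or pre-LOU schemas.

    import csv

    # Hypothetical mapping from one pre-LOU's column names to a common schema.
    # These names are illustrative only, not the actual P-LEI.org format.
    FIELD_MAP = {
        "CICI": "lei",
        "LegalName": "legal_name",
        "LegalJurisdiction": "jurisdiction",
        "RecordStatus": "status",
    }

    def consolidate(path, field_map=FIELD_MAP):
        """Read one pre-LOU CSV file and yield records in the common format."""
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                yield {common: row.get(source, "") for source, common in field_map.items()}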

In the first half of 2014, the Global Legal Entity Identifier Foundation (GLEIF) was formally established, and a common file format for the publication of LEI reference data by pre-LOUs was defined by the LEI Regulatory Oversight Committee (ROC). While influenced by P-LEI.org, the ROC common file format differs from the format in which P-LEI.org publishes consolidated data. By now, nearly all pre-LOUs publish their reference data in the ROC common file format, and some have discontinued the proprietary formats they used previously.

The introduction and adoption of the ROC common file format has made P-LEI.org’s primary function of consolidating proprietary formats less important for users of LEI reference data, and it is expected that the GLEIF will publish its own authoritative consolidated LEI data set.

As a result of these factors, the sponsors of P-LEI.org have decided not to offer a comparable consolidation service for LEI data published in the ROC common file format.

P-LEI.org has continued to process the “legacy” proprietary LEI data files that some pre-LOUs have elected to continue to produce. However, as more and more pre-LOUs have discontinued the redundant publication of LEI data in their old formats, the consolidated dataset compiled at P-LEI.org is becoming increasingly out of date.

All of the sponsors of P-LEI.org are happy to have provided a useful service to the LEI user community, and look forward to the continued evolution of the Global LEI System.

“Pre-LEI” consolidated data portal now available: p-lei.org

Tahoe Blue Ltd has joined with GS1, the Corporation for National Research Initiatives (CNRI), and FIX Protocol Limited (FPL) to create a portal for downloading datasets that consolidate the “pre-LEI” registrations of the currently active “pre-LOUs”: at present, the CICI Utility operated by DTCC and the GEI site operated by WM Datenservice.

Visit the site and register in order to download the consolidated pre-LEI datasets.


Legal Entity Identifier (LEI) update: FSB global standardization efforts

I recently attended the LEI Workshop conducted by the Financial Stability Board (FSB) on March 28 in Basel, Switzerland, at the offices of the Bank for International Settlements (BIS). This workshop brought together representatives of many countries, agencies, organizations, financial institutions, and corporations from around the world.

The FSB, as directed at the last meeting of the G20, is tasked with developing a global standard for the design, administration, allocation, and dissemination of a globally unique Legal Entity Identifier for use, ultimately, in all financial transactions. Because businesses and corporations in every industry must conduct some degree of financial transactions in order to manage strategic assets or participate in currency and commodity markets, the impact and scope of implementing a global LEI extend well beyond the financial industry, where the proposed use of a new global standard has the most obvious impact.

The FSB is working within a ‘time is of the essence’ window, spurred to make definitive progress on establishing a global mechanism for issuing LEIs in part because (1) the CFTC has stipulated that swap derivatives dealers falling under its regulatory auspices are to begin using an LEI this June, and (2) the G20 has instructed the FSB to report on the formation of global LEI standards at the G20’s next meeting.

A full report on the numerous issues and alternatives being debated on both sides of the Atlantic with regard to converging on the business requirements and technical aspects of a global LEI system is beyond the scope of this update. I have prepared a more focused recommendation on a key aspect of the discussion: the use of the LEI code field and the concept of a federated versus a central approach to the establishment of LEI Registrars. That recommendation is attached as a PDF to this discussion topic.
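For readers unfamiliar with the code field being discussed: the LEI format that ultimately emerged from this standardization effort (ISO 17442) is a 20-character alphanumeric code whose last two characters are ISO 7064 MOD 97-10 check digits. A minimal validity check of that structure, offered purely as background and not as a summary of the attached recommendation, might look like this:

    import re

    def is_valid_lei(lei: str) -> bool:
        """Validate the ISO 7064 MOD 97-10 check digits of a 20-character LEI."""
        lei = lei.strip().upper()
        if not re.fullmatch(r"[A-Z0-9]{20}", lei):
            return False
        # Letters map to 10..35 (A=10 ... Z=35); digits map to themselves.
        as_digits = "".join(str(int(c, 36)) for c in lei)
        return int(as_digits) % 97 == 1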

I would urge you to review this proposal. I believe it provides a means whereby the CFTC’s ‘early adoption’ of putting an LEI process in motion can be harmonized with the FSB’s task of developing a specification and design for the implementation of a global LEI. It is quite important that the confluence of these two developments produce a productive and additive benefit to the financial industry (and the global economy, for that matter), rather than a clash that hobbles the benefits of establishing a global LEI while the compliance costs of supporting and participating in a global LEI system remain undiminished.

The FSB has indicated that all comments and feedback intended as input to its deliberations must be submitted by April 16 in order for it to produce a final report by the end of the month. I have already submitted the attached recommendations to the FSB, and I look forward to comments or questions from the community on this important subject.

Recommendation for LEI issuance via authorized federated registrars – rev 1-04

Apple Internet TV? Or Google?


There is much speculation about Apple’s anticipated ‘next move’, notably in the area of taking Apple TV to the next level by providing Internet TV via the iTunes framework.

However, I believe that the real opportunity in the next disruptive wave of video, movies, TV, and the Internet is more likely to rest with Google, particularly given its acquisition of Motorola and the doorway that opens to developing a truly integrated Internet/TV set-top box.

This is because, in my opinion, the real breakthrough in both the Internet and cable TV programming is going to come when the full bandwidth of the broadband coax cable is used almost entirely for the Internet Protocol (IP) stack.

Currently, the broadband capacity of coax is inefficiently allocated, compartmentalized into frequency-multiplexed video slots dedicated to cable TV programming, with just a couple of those slots reserved for carrying IP traffic.

When the full bandwidth of coax is used to stream on-demand video via IP (with multicasting to avoid duplicate streams of the same programming), the true convergence of the Internet and TV (and search, and clickable advertising, etc.) will begin to be realized. Instead of offering the services separately, the real disruption will occur when the medium is integrated at the network, data link, and physical levels (layers 3, 2, and 1).
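Some rough arithmetic illustrates the scale of what reclaiming those slots would mean. The figures below are illustrative assumptions only (roughly 750 MHz of usable downstream spectrum, 6 MHz channel slots, and on the order of 38 Mbps of payload per 256-QAM slot), not measurements of any particular cable plant:

    # Illustrative capacity estimate if every video slot on the coax carried IP.
    usable_spectrum_mhz = 750   # assumed usable downstream spectrum
    slot_width_mhz = 6          # North American channel slot width
    mbps_per_slot = 38          # approximate payload of one 256-QAM slot
    ip_slots_today = 2          # "just a couple" of slots carrying IP now

    slots = usable_spectrum_mhz // slot_width_mhz
    print(f"IP capacity today: ~{ip_slots_today * mbps_per_slot} Mbps")
    print(f"IP capacity with all slots: ~{slots * mbps_per_slot / 1000:.1f} Gbps")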

As others have pointed out, the Cable TV industry is not ‘hurting’ the way the music industry was when Steve Jobs pulled 99-cent track downloads out of his hat. Google and Motorola can easily provide the technical means to integrate content providers, broadcasters, ISPs, and the Cable TV infrastructure, but the business model of the existing Cable TV providers will have to evolve and be attractive to the Cable TV franchises. The bandwidth of Cable TV coax is so high that it will be a very long time before competitive “last mile” delivery infrastructure (i.e., fiber) can reach the critical mass needed to replace or threaten coax.

The Cable TV providers see that long-term trend coming, and it will provide them with the incentive to work out new relationships with the consumer to bring new life into the existing coax network.

And can you imagine how the real consumer economy, and the advertising and marketing efforts of all manner of businesses, will respond when it is possible to superimpose clickable Internet links on, and synchronize them with, video, movie, and TV programming of all kinds, including commercials, of course?

It will be absolutely HUGE!

Byte-metered pricing rears its ugly head in the name of ‘net neutrality’

The following is a critique in response to an article on Slate by Farhad Manjoo, who tries to make the case that pricing Internet service by the byte, as opposed to by bandwidth, is not only inevitable but preferable (or at least “reasonable”). Mr. Manjoo’s comments can be viewed here.

Mr. Manjoo,

Your perspective is misguided and out of whack, as is your pronouncement of what rates seem “fair” or “reasonable”, for several reasons.

For starters, charging by the byte is more a measure of *storage* cost, and it is not appropriate to the concept and practice of the streaming experience the Internet has quickly become, an experience for which the marketing and advertising promotions of the ISPs and Telcos are largely responsible!

Consider this: would you propose that the billing paradigm for cable TV be changed so that it is based on the amount of video information broadcast over cable channels? It is quite easy to measure the number of information bits contained in a cable video transmission, and you could extend your rationale to suggest that viewers of cable TV programming have a certain allotment of total informational bits they can consume in a month, one they must budget lest the charges for the service escalate. If you watch too much TV and exceed your allotment, you pay more.

When you elect a cable TV programming package, the assumption is that you can ‘watch’ that programming as much as you want in any given month in exchange for the monthly charge. In other words, the contract is based on agreeing to pay for a given level of always-on streaming experience, not for the number of video bits pushed down the cable.

With regard to connectivity to the Internet, if an ISP quotes a certain level of bandwidth (e.g., 5 megabits/sec), that is the value proposition a consumer elects to pay a certain amount for on a monthly basis. Lower bandwidth costs less, and the quality and capabilities of the informational or media experience change according to the bandwidth elected.

Consumers have no clue how many bytes it takes to render a particular web page, nor how many bits are being streamed to provide a given level of audio or video resolution. That is a completely foreign value proposition for the contemporary Internet consumer, and your estimates of how many ‘bytes’ are reasonable represent a static and irrelevant measure of what a consumer expects to pay for.
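To make the mismatch concrete, here is some back-of-the-envelope arithmetic using the 5 Mbps figure above and a 30-day month (illustrative numbers only):

    # What does an advertised 5 Mbps connection amount to in bytes per month?
    advertised_mbps = 5
    seconds_per_month = 30 * 24 * 3600

    bits = advertised_mbps * 1_000_000 * seconds_per_month
    print(f"~{bits / 8 / 1e12:.1f} TB/month at the advertised rate")   # about 1.6 TB

    # A 2 GB monthly cap, by contrast, is exhausted in under an hour at that rate.
    cap_gb = 2
    minutes = cap_gb * 8e9 / (advertised_mbps * 1_000_000) / 60
    print(f"A {cap_gb} GB cap lasts about {minutes:.0f} minutes at the advertised rate")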

I am not suggesting that consumers be given ‘infinite’ bandwidth for a fixed cost, mind you, and I certainly recognize that guaranteeing a per-user quality of service carries a significant total network infrastructure capacity requirement and expense. But charging by the byte as a means to somehow reconcile total system bandwidth with aggregate consumer demand is bass-ackwards, guaranteed to cause dissatisfaction and disruption of consumer demand and to impair the continued growth and expansion of the entire Internet-based economy.

Charging by the byte simply offers the ISPs and Telcos the ability to continue to grow revenue without concomitant improvement and investment in Internet infrastructure, and turns the Internet into a fixed utility of bandwidth-capped pipes appearing to deliver a finite, consumable resource like water or natural gas, as opposed to the far more unlimited ability to simply move electrons or photons back and forth at will without exhausting the bits, or needing to ‘store’ them.

What the ISPs and Telcos need to do is simply quote bandwidth at a rate they can deliver to the growing population of consumers, and use the growth in consumer volume as the source of revenue to fund ongoing capital investment in infrastructure and total system bandwidth to meet the growing aggregate demand. LOWER the continuous per-user bandwidth if need be in order to supply that level of service continuously to the customer population! Charge more for higher bandwidth, yes, but do NOT quote ultra-high-speed bandwidth at a rate the infrastructure cannot support, which would force you into ‘metering’ out the total bandwidth as a piecemeal, non-continuous, and potentially interruptible service based simply on a number of bits or bytes.
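As a sketch of that suggestion, the capacity-planning arithmetic involved is simple; the aggregate capacity, subscriber count, and peak-hour concurrency figures below are invented for illustration:

    # Pick a per-user rate an ISP can honestly quote from capacity it actually has.
    total_capacity_gbps = 40    # assumed aggregate downstream capacity of a service area
    subscribers = 20_000        # assumed subscriber count
    peak_concurrency = 0.35     # assumed fraction of subscribers active at peak

    sustainable_mbps = total_capacity_gbps * 1000 / (subscribers * peak_concurrency)
    print(f"Quote roughly {sustainable_mbps:.1f} Mbps per user, continuously deliverable")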

Your estimation of what is ‘fair’ or ‘reasonable’ in terms of bits or bytes per month is totally unfounded; it would lock in a fractured and fragmented Internet experience calibrated to current levels of capability, and it would make that capped experience subject to outages and interruptions driven by unexpected budget constraints as the underlying amounts of data required for certain types of services necessarily evolve.

You display your most egregious lack of appreciation of what the Internet is really about with the following comments:

And say hooray, too, because unlimited data plans deserve to die. Letting everyone use the Internet as often as they like for no extra charge is unfair to all but the data-hoggiest among us—and it’s not even that great for those people, either. Why is it unfair? For one thing, unlimited plans are more expensive than pay-as-you-go plans for most people. That’s because a carrier has to set the price of an unlimited plan high enough to make money from the few people who use the Internet like there’s no tomorrow. But most of us aren’t such heavy users. AT&T says that 65 percent of its smartphone customers consume less than 200 MB of broadband per month and 98 percent use less than 2 GB. This means that if AT&T offered only a $30 unlimited iPhone plan (as it once did, and as Verizon will soon do), the 65 percent of customers who can get by with a $15 plan—to say nothing of the 98 percent who’d be fine on the $25 plan—would be overpaying.

The Internet is NOT a storage device! And your assessment of what constitutes reasonable ‘use’ of the Internet, based on current usage (and justifications by AT&T!), is backward-looking. You might as well be saying that, because 98 percent of the population “gets by” with wagons pulled by horses, there is little need for the horseless carriage. That is sooo 1890! Best to look ahead to the future, not in your rear-view mirror.

You need to rethink your concept of Internet billing, or you will soon find yourself in the position of needing to calculate the equivalent of how far you can drive on a fixed battery charge in any given month.

BAD IDEA!

A Unique Opportunity for a Win-Win in Financial Risk Management

Seeking to manage, or at least measure, financial and systemic risk through closer inspection, analysis, and even simulation of financial contracts and counterparties, at a more precise level of detail and frequency, is a significant departure from many traditional risk management and supervision practices.

The results of more detailed contractual and counterparty analysis, aggregated from the bottom up to the level of the enterprise, offer far better insight into the risk dimensions of both a firm and the financial system than practices that apply broad-brush composite risk measures and coefficients to balance sheets from the top down. The latter approach greatly reduces the burden of compliance and regulatory reporting, but it is also much less informative and accurate.

The goals and objectives of accounting practices and methodologies are generally to provide an accurate view of the financial condition and activity of a firm as a going concern, to the extent that one-time events are noted as exceptions, and the effects of other transactions such as capital investment, revenue recognition, and depreciation are spread across a wider horizon of ‘useful life’.

When accounting methodologies aimed at presenting a longer-term, ‘trailing average’ or smoothed-out view of a firm’s behavior are applied to risk management, artifacts can arise that obscure or mask risk, and the one-time events that accounting practice seeks to footnote should in all likelihood be headline topics for risk analyses such as stress testing.

Some fundamental premises of how best to respond, as a firm and as an industry, to regulatory requests for more detailed contractual and counterparty information:

1) Individual firms should seek to turn what is traditionally viewed as a non-productive overhead cost of regulatory reporting and compliance, in light of the new mandate to provide more detailed financial information, into an opportunity to map contract-level financial positions into a financial data repository that spans the entire balance sheet (and off-balance-sheet positions) of the firm and allows across-the-board analysis and stress testing using level-playing-field scenarios and assumptions.

Currently, most firms’ product divisions are isolated in operational silos, with proprietary system-of-record formats and potentially incompatible risk management methodologies and assumptions. Implementing this approach will result in risk measurement and management practices that are more timely, more comprehensive, and based on higher-resolution detailed data: a ‘win’ for the firm.

Firms should not view the proposal to create and populate such a database as an attempt to jack up their entire financial operations and insert a new ‘ground floor’, nor as a replacement for internal data warehousing initiatives. Rather, the model is for an interface database to be populated with appropriate mappings from existing systems within the firm.

As such, this database, though more comprehensive and detailed than traditional G/L reporting data stores, is not ‘mission critical’; it is more along the lines of decision support, and it could provide a platform for productive internal use within the firm as well.

2) The industry as a whole, in conjunction with regulators, should strive to agree on a common standard for representing low-level financial positions and contracts, so that each firm does not create its own proprietary version of such a data model, which would then require further re-mapping and translation (and likely incompatibility) at the level of systemic oversight. (A minimal sketch of what such a contract-level record might contain follows this list.)

Having the financial industry and the public regulators agree on a common data model, with requisite standardized reference data, will be a “win” for the public good as well, as it will greatly reduce the cost and complexity of making sense of more detailed financial information for purposes of analysis at the systemic level in the Office of Financial Research.

Finally, by implementing a form of distributed reporting repository, each institution can make the database available on a secure basis not only to regulators but also to internal staff and to vendors who can supply value-added reporting and analysis tools predicated on the standard model. This is a third win, one of economic efficiency, as it would make better risk management practices available to a wider range of institutions that otherwise would not choose, or be able, to develop such tools.
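To make the idea of a common contract-level representation concrete, here is a minimal, purely hypothetical sketch of the kind of record such a standard might define. The field names and types are illustrative assumptions, not a proposed or existing industry schema:

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class ContractPosition:
        """Hypothetical contract-level position record for a shared data model."""
        contract_id: str        # firm-internal or registry identifier
        counterparty_id: str    # e.g., an LEI once the global identifier exists
        asset_class: str        # "IR_SWAP", "CORP_BOND", "FX_FORWARD", ...
        notional: float
        currency: str           # ISO 4217 code
        trade_date: date
        maturity_date: date
        mark_to_market: float   # current valuation in the contract currency

    # Example record, purely illustrative:
    pos = ContractPosition("C-000123", "LEI-PLACEHOLDER", "IR_SWAP",
                           10_000_000.0, "USD", date(2011, 3, 15),
                           date(2016, 3, 15), -42_500.0)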

The increased regulatory requirements mandated by the recent passage of the Dodd-Frank Act are no doubt an unwelcome development for financial institutions, and the challenges of fulfilling the specific functions of the Office of Financial Research as delineated therein would be formidable even with full cooperation from the industry. However, given that the work needs to be done, and time and effort expended, it is clearly in the best interests of the financial industry and the public if the projects can be pursued in a manner that produces substantial long-term benefits to offset the additional costs incurred.