Testing the Futures and Options Deployable Software
After the futures and options deployable software has been downloaded, clients can start testing the workflow and generating margin responses. This topic includes the following sections related to Margin Software:
Loading the Risk Parameter Files
Downloading Risk Parameter Files
- The rpfs (Risk Parameter File) directory in the deployable software needs to be populated with the unzipped RPF corresponding to the trade portfolio.
- Navigate to the RPF directory from the SDK: arcdl-sdk > arcDlApplication > rpfs.
Please note the deployable margin software version number is tightly coupled to the version of the market data file the user selects to load. This user guide indicates the compatible market data packages for a given build and how to run with limited backwards/historical file compatibility for SPAN RPFs.
Downloading Risk Parameter Files from Secure FTP
- SPAN Risk Parameter Files (RPFs) for the futures and options deployable software are available through CME's SFTP site. Users will need SFTP access to a new location to download the deployable Risk Parameter Files (RPFs).
- Standard (non-chunked) risk parameter files:
- SFTP file location at client local site: cme/ftp/FIRMID/pub/SPAN2/rpf
- File Names:
- Production file, containing all contracts active in the SPAN 2 framework as of the date posted: cme.span2.yyyymmdd.[cycle].zip
- Test files, containing all active contracts from production plus new contracts active in a test period for new asset classes: test.cme.span2.yyyymmdd.[cycle].zip
- In the above, 'cycle' refers to the clearing cycle code produced throughout the business day: c (complete EOD), i (ITD), or s (early EOD).
- Deployable software default RPF location: <install directory>\arcdl-sdk\arcDlApplication\rpfs
- Chunked risk parameter files, available for use with versions 3.1.10 and above
- The chunked RPFs support two use cases:
- User type 1: users who require all RPF data across all CME-listed contracts but want to manage download time by retrieving data packages in parallel.
- User type 2: users who do not need all CME-listed RPF data and instead want to download a subset of RPF data packages sufficient for the risk needs of the contracts in their portfolios.
SFTP file location at client local site: cme/ftp/FIRMID/pub/SPAN2/rpf/[cycle code]/yyyymmdd/chunked
Instructions for File Use
User type 1 - download all chunked data packages
- Download all files from the directory location described above labeled as “common” and “chunk.” The file name conventions are:
- Common files: cme.span2.yyyymmdd.[cycle code].common_[number].zip
- Chunk files: cme.span2.yyyymmdd.[cycle code].chunk_[number].zip
- Note that files with the “test.” prefix support contracts currently live in a test window, while those without the prefix contain contracts live in production.
- Note the number of common and chunk files is variable.
- Unzip all files in a single command.
- A single folder will be created containing all data.
- Point the program to this new RPF folder using the standard process and start it; it will initialize with the chunked RPF data.
User type 2 - download a subset of targeted data packages
- Download all common files from the directory location described above. All common files are required to run the program. The file name conventions are:
- Common files: cme.span2.yyyymmdd.[cycle code].common_[number].zip
- Note that files with the “test.” prefix support contracts currently live in a test window, while those without the prefix contain contracts live in production.
- Note the number of common files is variable.
- Download the product map file. The file name conventions are: [test. - if a test environment file, else omitted]exchangeProductMap.json
- Use the exchange product map file to interpret the list of chunked files needed for the user's portfolio(s). This file contains a list of required chunk file names organized by exchange code (CME, CBT, NYMEX). The field 'productCode' is reserved for future use and will be enhanced to support cleared product codes used in client portfolios.
- For example, a user who trades only on the NYMEX exchange and wants to download only NYMEX contracts could read the list of required chunks where exchange = NYMEX (see the parsing sketch after this list).
- Download all chunk files as determined above from the directory location described above. The file name conventions are:
- Chunk files: cme.span2.yyyymmdd.[cycle code].chunk_[number].zip
- Note that files with the “test.” prefix support Equities in the SPAN 2 framework, while those without the prefix contain only Energies in the SPAN 2 framework.
- Note the number of common and chunk files is variable.
- Follow steps 2-3 in User type 1 instructions above.
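The following is a minimal parsing sketch for the exchange product map described above, using the Jackson JSON library. The field names ("exchange", "chunkFiles") and the top-level array layout are assumptions made for illustration; consult the actual exchangeProductMap.json for its exact schema.

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class ChunkSelector {
    // Returns the chunk file names listed for one exchange code (e.g. "NYMEX").
    // Field names below are assumptions; check the downloaded product map for the real ones.
    public static List<String> chunksForExchange(File productMap, String exchangeCode) throws Exception {
        JsonNode root = new ObjectMapper().readTree(productMap);
        List<String> chunkFiles = new ArrayList<>();
        for (JsonNode entry : root) {                                   // assumed: top level is an array
            if (exchangeCode.equals(entry.path("exchange").asText())) { // assumed field name
                for (JsonNode chunk : entry.path("chunkFiles")) {       // assumed field name
                    chunkFiles.add(chunk.asText());
                }
            }
        }
        return chunkFiles;
    }
}

A user type 2 workflow would then download the common files plus the chunk files returned by this lookup, and start the software as described in the User type 1 steps.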
Initializing RPF Data in Deployable Software
- Users must unzip the SPAN and SPAN 2 RPFs prior to initializing the software. The unzipped files are folders with the following names:
- SPAN 2: yyyymmdd_FNO_SPAN2_C
- Future state filenames will update the _C suffix to describe new daily cycles (e.g. _S, _I).
- Files should be unzipped at the top level only; nothing within the unzipped folder requires further unzipping or editing. All data within is obfuscated except for the SPAN risk array file.
- The software requires the 'yyyymmdd_FNO' prefix in this filename to run.
- Location: <install directory>\arcdl-sdk\arcDlApplication\rpfs
The location of the SPAN file within the file structure is <install directory>\arcdl-sdk\arcDlApplication\rpfs\[unzipped file]\marketdata
Users can add SPAN files here to compute inter-exchange spread credits, for instance between CME and MGE.
- Users can switch between multiple points in time when multiple dates of risk parameter files are loaded. The argument controlling the loaded risk parameter file is in start_windows.bat/start_linux.sh.
- The argument is -DfnoRpfDirectory=[market data unzipped filename, e.g. yyyymmdd_FNO_SPAN2_C].
- If multiple files with the same business date are supplied in the root directory and the above argument is not used, the deployable software will use the following logic to initialize with the "latest" RPF file: it first attempts to initialize the file with the most recent date in the filename pattern (i.e. yyyymmdd_FNO...); if that is not satisfied, the dataset with the most recent local timestamp will be initialized.
- It is recommended users utilize the -DfnoRpfDirectory argument described above to pass a reference to the specific dataset and avoid issues with multiple daily data sets (see the sketch below).
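Because -D arguments are standard JVM system properties, an application embedding the SDK could in principle set the same property programmatically before initialization. This is only a hedged sketch of that idea; the documented routes are the start-script argument above or the fnoRfpDirectory builder method shown later in 'Initialization of RiskAnalyticsService'.

// Assumption: the property is honored when set programmatically before the service is created.
System.setProperty("fnoRpfDirectory", "20230131_FNO_SPAN2_C"); // example dataset folder name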
Best Practices for Using CME Risk Parameter Files
- Users are not expected to manually edit the contents of the CME risk parameter files other than adding additional SPAN files to the /marketdata directory when necessary (for instance to compute inter-exchange spreads with the MGE SPAN file).
- Users are not expected to change the naming conventions of any of the contents of an RPF file.
- The RPF file structure can change from time to time.
- CME does not support non-CME SPAN files which do not meet the formatting requirements of the deployable software. These files may load incorrectly or not at all during initialization.
Limited Backwards Compatibility
- The deployable software contains limited backwards compatibility for older versions of the SPAN risk parameter file. This is intended for margining historic points in time.
- When a user supplies a SPAN RPF that the deployable software determines may not be compatible (i.e. is from an older point in time), it will log out this error: WARNING: Current configuration in RPF not compatible with deployable version. Overriding for backward compatibility.
- The program will continue to initialize the supplied data, but results should not be considered for number-matching against SPAN software.
- It is not recommended to initialize the software with older versions of the SPAN 2 RPF. This will result in calculation failures.
Using the SPAN Over-ride Feature
- Deployable software can read from SPAN files outside of the standard RPF data structure via the SPAN Over-ride feature. This is intended for use while loading historic risk array files in SPAN format and computing cross-margin benefits between CME and OCC in the SPAN model.
- Loading historic risk array files, including the legacy .pa2 and .xml files:
- The deployable SDK allows users to specify a legacy SPAN file in an override directory during start-up.
Please note this workflow requires a user to supply a compatible SPAN-only RPF file (i.e. cme.span.yyyymmdd.[cycle].zip) prior to starting the initialization process. More details regarding the SPAN RPF files can be found in 'Downloading Risk Parameter Files' above.
- Supplying a SPAN 2 RPF will invoke the SPAN 2 methodology, which is not expected for the historic margin workflow.
- Users should create a new directory or directories for the desired SPAN pa2/xml files and download/zip/store file(s) to this location.
For example, the user could set up a new directory locally as "[internal path]\arcdl-sdk\arcDlApplication\SPANFILES".
Users should specify the -DspanFilesDirectoryOverride=[local directory location] configuration in the start_windows.bat or start_linux.sh file.
For example, the user stores SPAN files in a new SPANFILES directory and starts the software with the -DspanFilesDirectoryOverride=[internal path]\arcdl-sdk\arcDlApplication\SPANFILES argument in the start file. Please note it is expected users download the standard CME historical SPAN file in .pa2 or .xml format. The 'c21' version of the .xml file is not expected.
- See example in Initialization of RiskAnalyticsService later in this document.
- Users can employ the same process as above for processing cross-margin SPAN files (OCC/CME). For this workflow, it is expected the user would only supply the cross-margin SPAN file from this FTP site: ftp://ftp.cmegroup.com/span/data/xma to the SPAN over-ride directory. This is imperative because CME/OCC SPAN files represent cross-margin only products, not to be confused with the normal CME products in a CME SPAN file.
Redis RPF Load
- It is highly recommended to use Redis cache for loading SPAN 2 risk parameter files during production use.
- When using Redis, users will need to set the JVM argument (-DcacheMode=Redis) or the equivalent property (System.setProperty("cacheMode", "Redis")) before initializing the deployable service (see the sketch at the end of this section).
- Users can utilize distributed Redis cache or local Redis cache.
- Local Redis means that your application and Redis cache are running on the same server.
- Distributed Redis cache can be used to improve memory management depending on the expected usage load.
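A minimal sketch combining the cacheMode property from the note above with the initialization pattern shown later in 'Initialization of RiskAnalyticsService':

// Enable the Redis cache before creating the service (equivalent to passing -DcacheMode=Redis).
System.setProperty("cacheMode", "Redis");
RiskConfig riskConfig = new Builder()
    .rfpDirectory("/Users/someUser/rpfDirectory")
    .build();
RiskAnalyticsService riskAnalyticsService = RiskAnalyticsServiceFactory.createRiskAnalyticsService(riskConfig);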
Local Redis Cache Load
- Unzip the SPAN 2 RPF under (same as disk load) <install directory>\arcdl-sdk\arcDlApplication\rpfs.
- An example of RedisConfig with basic configurations is shown in the sketch at the end of this section.
- Users can configure a timeout using the withTimeout (number of milliseconds) property. The default for this property is 60 seconds (60000 ms). An example of the timeout configuration can be found in the examples "ExampleMainWithTimeout.java".
- SPAN 2 RPF will be loaded to local Redis during deployable startup. “withLocalRedis” determines if the RPF is to be loaded or not by the deployable startup. The default behavior, if loading is happening, is to clear the cache. For users who do not wish to follow the default cache clear behavior, the flag clearRedisCacheOnLoad can be used to retain the cache during load. Please review the README.md for more details.
- The default local Redis port is 6379. This default can be changed while creating the RiskAnalyticsService (see the sketch at the end of this section).
- RiskConfig.getRedisConfig() returns RedisConfig where you can override port.
- Please refer to the ExampleMain class in the SDK.
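The following is a hedged sketch assembled from the RedisConfig properties named in this section (withLocalRedis, withTimeout, and the port override). The exact setter names and signatures may differ; the ExampleMain and ExampleMainWithTimeout classes in the SDK are the authoritative references.

RiskConfig riskConfig = new Builder()
    .rfpDirectory("/Users/someUser/rpfDirectory")
    .build();
RedisConfig redisConfig = riskConfig.getRedisConfig();
redisConfig.withLocalRedis(true);   // load the SPAN 2 RPF into local Redis at startup
redisConfig.withTimeout(120000);    // timeout in milliseconds (default is 60000)
redisConfig.withPort(6380);         // hypothetical setter: override the default port 6379
RiskAnalyticsService riskAnalyticsService = RiskAnalyticsServiceFactory.createRiskAnalyticsService(riskConfig);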
Distributed Redis Cache Load
- Users can load the SPAN 2 RPF into a central cache that can then be used by any number of deployable instances.
- Unzip the SPAN 2 RPF under (same as disk load) <install directory>\arcdl-sdk\arcDlApplication\rpfs.
- Load the cache by providing the SPAN 2 RPF folder and Redis connection details.
- Please refer to the ExampleCacheLoad class located in the SDK.
- Start deployable with property -DcacheMode=redis and with redis server details. Please refer to the ExampleMainDistributedCache located in the deployable SDK.
Clearing the Redis Cache Load
- Users can manually clear the central cache if needed:
- Use clearCache to completely clear the cache.
- Data for a specific loaded RPF can be deleted from the cache using deleteFromCache.
- When loading the cache, the default is to clear it first. To override this behavior, set the property clearRedisCacheOnLoad to false (a sketch follows this list).
- Refer to ExampleCacheLoad class in SDK.
- Manual cache clearing is only supported for the distributed cache; if using the local Redis cache, simply create a new RiskAnalyticsService instance instead.
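A hedged sketch of the clearing operations above. The handle exposing them (shown as cacheLoader) and the exact signatures are assumptions; the ExampleCacheLoad class in the SDK shows the real calls.

// cacheLoader is a hypothetical handle to the distributed-cache loader demonstrated in ExampleCacheLoad.
cacheLoader.clearCache();                             // completely clear the central cache
cacheLoader.deleteFromCache("20230131_FNO_SPAN2_C");  // hypothetical: remove the data loaded from one RPF
System.setProperty("clearRedisCacheOnLoad", "false"); // retain existing cache contents on the next load (property route is an assumption)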
Margin Request and Input Portfolio
- The source code under the examples/arcdl-example directory is an example of how to correctly call the API. The source under the examples/arcdl-web-example is an example of a spring boot application that allows the API to be called over REST.
- The MarginController class contains examples of how to use the risk engine interface RiskAnalyticsService, taking either CSV or JSON format. Using this, a developer can embed the arcdl deployable jar and risk library within their own application.
- Build an input portfolio using the Risk API schema found in the CME CORE software center or by using the examples as a guideline. (See appendix for Risk API input format examples in .JSON and CSV)
- The RiskPortfolioRequestMessage structure will be organized by categories and further detailed through a subset of attributes within each category. A class snippet is below:
Margin Call and Margin Request
- The MarginController class is where to find the call to retrieve the margin calculations. This is shown below, and it takes a RiskPortfolioRequestMessage object.
- The margin service takes a RiskPortfolioRequestMessage object and returns a MarginDetailResponseMessage object (a hedged call sketch follows this list).
- The snippets below describe the class structure of the object sent to the margin service. Fields tagged with @XmlElement or @XmlAttribute are how the JSON message is mapped to object instances of these classes and vice versa. Note that not all values need to be returned unless they are marked as required.
- The RiskPortfolioRequestMessage snippet is below.
- This consists of a payload object called RiskPortfolioRequest. This contains the point in time details, which should match the RPF date that is used to calculate margin on the call.
- The RiskPortfolio object holds the details of each portfolio being submitted for margin. A user can submit multiple portfolios at once. A snippet of the RiskPortfolio class is below:
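The class snippets referenced in this section are available in the SDK source. As a hedged sketch of the overall call, where the margin-service method name is hypothetical (the MarginController class in examples/arcdl-web-example shows the actual invocation):

// Build or convert a RiskPortfolioRequestMessage, then pass it to the margin service.
RiskPortfolioRequestMessage request = buildRequestFromJsonOrCsv();                    // hypothetical helper; see MarginController
MarginDetailResponseMessage response = riskAnalyticsService.calculateMargin(request); // hypothetical method name
// The response payload (MarginDetailResponse) carries the point in time plus a list of
// PortfolioMarginDetail objects, one per submitted portfolio.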
Analyzing the Margin Response Message
- The MarginDetailResponseMessage class snippet is below:
- This consists of a payload object called MarginDetailResponse. This contains the point in time details, which will correspond to the date used in the request above.
- The list of PortfolioMarginDetail objects holds details of margin for the portfolio of trades that were submitted. A snippet of the PortfolioMarginDetail class is below:
Freezing a Point in Time and Using Add Market Data
- The freeze point in time feature enables users to combine a non-CME risk array file with a CME SPAN or SPAN 2 risk parameter file (CME RPFs) with different cycles or business dates.
- This feature freezes the point in time and cycle of the non-CME risk array to match the CME risk parameter file’s point in time and cycle. The margin produced will return as of the frozen point in time using risk arrays across all supplied SPAN risk arrays.
- Please note margin results accuracy degrades for frozen risk arrays. The most accurate margin results are possible when a user matches all risk arrays and CME RPFs to the same date/cycle during initialization.
- Usage notes:
- The deployable software will discover the frozen point in time and cycle from inside the CME risk parameter file (either a SPAN or SPAN 2 RPF), provided the additionalSpanFiles are loaded with different cycles/PITs.
- Users should specify the full path for the non-CME SPAN file(s) they want to pass in, using the argument -DadditionalSpanFiles=[local directory location] in the batch start-up instructions, or use the extended API call addMarketData with the key additionalSpanFiles=[local directory location].
- Web app endpoint: {{API_URL}}/margins/addMarketData?AdditionalSPANFiles=cme=cladm/noncmespanfiles&inputType=RISKFNO (a minimal HTTP client sketch follows these usage notes).
- The frozen calculation date is based on the CME RPF file business date and is meant to freeze a single calculation instance.
- Users will continue to use the API call MarketDataRefresh when point in time aligns across multiple RPFs but not the cycle.
- Users can only refresh the freeze following the typical execution order of SPAN cycles. For instance, YYYYMMDD.i to YYYYMMDD.c, not YYYYMMDD.c to YYYYMMDD.i.
- Users can only refresh the freeze by going forward in time on the calendar date. For instance, 20230130.c to 20230131.c, not 20230131.c to 20230130.c.
- Users can only reload the most recent MA or MP RPF to freeze the PIT. For instance, if a user initially loads YYYYMMDD.i and then adds YYYYMMDD.c without refreshing the market data, going directly to addMarketData will only freeze the PIT at YYYYMMDD.i.
- Portfolios supplied in any format will continue to pass to the calculator regardless of point in time or cycle supplied within the portfolio.
- The portfolio call must include the frozen point in time in the pointInTime field.
- Users cannot load more than one SPAN risk array file with the same ClearingOrg.
- The freeze point in time feature cannot be used when the SPANFilesOverride feature is in use.
- The freeze point in time feature is only designed to work when a CME SPAN or SPAN 2 risk parameter file is supplied in the RPF directory.
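For users running the arcdl-web-example, the endpoint in the usage notes above can be exercised with the JDK HttpClient. This is a hedged sketch: the HTTP verb (POST with no body) and the localhost URL are assumptions; substitute your own API_URL and directory value.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class AddMarketDataExample {
    public static void main(String[] args) throws Exception {
        String apiUrl = "http://localhost:8082"; // default server port per the start-up argument table below
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create(apiUrl + "/margins/addMarketData"
                + "?AdditionalSPANFiles=cme=cladm/noncmespanfiles&inputType=RISKFNO"))
            .POST(HttpRequest.BodyPublishers.noBody()) // HTTP verb is an assumption
            .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}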
Accessing the API: Best Practices
- Always use the converter factories provided to create a converter. This will mean that any internal change made within the converters will not cause any code changes for those developers that have already integrated with the application.
- Always use interfaces rather than class implementations directly, as any future changes made to concrete classes can be hidden behind interfaces, causing no change for existing integrations with the application.
Appendix
Inputs
Inputs for a futures and options portfolio will contain data definitions for the Risk Portfolio Message. The Risk Portfolio Message structure will be organized by categories (Header, Point In Time, Portfolio, Entities, Positions, and Instruments) and further detailed through a subset of attributes within each category. For the full list of attributes, please refer to the SPAN 2 Risk Analysis Framework document available in CME CORE or https://www.cmegroup.com/confluence/display/EPICSANDBOX/SPAN+2+Risk+Analysis+Framework.
- Deployable Margin Software accepts two input formats: .json format and .csv format.
- Examples of .json & .csv formatting structure can be found in the CME CORE NR download center under the Java Deployable SDK section.
- Attributes will not be present in the input message if an optional field is left blank. The attached .json sample includes optional attributes just for clarity.
- If encoding JSON, data type "decimal" can be encoded as "string".
- Users can define multiple 'entity' blocks.
- 'Underlying Period Code' is a conditional attribute applied to options only. If an option's values share multiple similarities (i.e. product code and period code), or if the user is using the deployable software to compute requirements for non-CME markets, then 'underlying period code' is required.
- A unique portfolio identifier is based on the fields: firmId, accountId, and originType.
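A minimal illustration of the uniqueness rule above: two inputs refer to the same portfolio only when all three fields match.

// Illustrative values only; the key is the combination of firmId, accountId and originType.
String firmId = "123", accountId = "ACCT-1", originType = "CUST";
String portfolioKey = String.join("|", firmId, accountId, originType); // "123|ACCT-1|CUST"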
Outputs
The Margin Results Message will be organized at various levels (Portfolio, CCP, POD, Product Group) and each level will contain further details for margin requirements, valuations, and sensitivities, further broken down by currency when applicable. This structure will support results for Futures and Options products margined through the SPAN and SPAN 2 risk models. For the full list of attributes and the full margin results data model, please refer to the SPAN 2 Risk Analysis Framework documentation available in CME CORE or https://www.cmegroup.com/confluence/display/EPICSANDBOX/SPAN+2+Risk+Analysis+Framework.
- Deployable Margin Software returns a margin result in .json message format.
- Examples of .json message output can be found in the CME CORE NR download center under the Java Deployable SDK section.
Omnibus Combinations
The OmnibusInd indicator defines the relationship between parent and child portfolios. The table below shows the possible combinations of account type, NetQty, NakedLong, and NakedShort when OmnibusInd is equal to YES and NO.
AccountType | OmnibusInd | NetQty | NakedLong | NakedShort |
---|---|---|---|---|
SPECULATOR | YES | 0/null | ≥ 0/null | ≥ 0/null |
HEDGE | YES | 0/null | ≥ 0/null | ≥ 0/null |
SPECULATOR | NO | <>/null | 0/null | 0/null |
HEDGE | NO | <>/null | 0/null | 0/null |
MEMBER | NO | <>/null | 0/null | 0/null |
If OmnibusInd is set to YES, then the final NetQty must be either 0 or null, Naked Long must be null or non-negative, Naked Short must be null or non-negative, and the account type must be either HEDGE or SPECULATOR.
If OmnibusInd is set to NO, then the final NetQty must not be null, Naked Short must be null or 0, and Naked Long must be null or 0.
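The combinations above can be summarized in a small illustrative check (this is not the software's validation code; quantities are shown as nullable Integers so that null can be represented):

public class OmnibusCheck {
    // Illustrative check of the combinations in the table above; not CME's validation logic.
    public static boolean isValidCombination(String accountType, boolean omnibus,
                                             Integer netQty, Integer nakedLong, Integer nakedShort) {
        if (omnibus) {
            // OmnibusInd = YES: account type HEDGE or SPECULATOR, NetQty 0/null,
            // NakedLong and NakedShort null or non-negative.
            boolean typeOk  = "HEDGE".equals(accountType) || "SPECULATOR".equals(accountType);
            boolean netOk   = netQty == null || netQty == 0;
            boolean longOk  = nakedLong == null || nakedLong >= 0;
            boolean shortOk = nakedShort == null || nakedShort >= 0;
            return typeOk && netOk && longOk && shortOk;
        }
        // OmnibusInd = NO: NetQty supplied and non-zero, NakedLong/NakedShort null or 0.
        boolean netOk   = netQty != null && netQty != 0;
        boolean longOk  = nakedLong == null || nakedLong == 0;
        boolean shortOk = nakedShort == null || nakedShort == 0;
        return netOk && longOk && shortOk;
    }
}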
The following omnibus edge cases will return warning messages:
- When a fully disclosed omnibus request (i.e. no positions) provides an ID but there are no children present, a warning message will occur and the portfolio will not be returned in the margin response.
- If a child account and parentPortfolioId do not match an Omnibus ID.
- Different accounts with the same ID will return a warning for the account they are not aggregating to.
- Multiple omnibus parent accounts with the same Id will return a warning message.
Errors List
Notes about errors:
- ‘%’ pertains to injected error details which are not defined below.
- Errors will be numbered (i.e. ERR013_1, ERR013_2) when multiple positions in the same portfolio contain errors.
- Recommended remediation of error types
- Please follow the specified parameters here: SPAN 2 Risk Analysis Framework - Electronic Platform Information Console - Confluence (cmegroup.com)
- Reach out to Post Trade Services: posttradeservices@cmegroup.com
- Ensure the following fields are filled:
- CSV: currency, customerAccountType, omnibusInd, clearingOrganizationId, accountId, originType, netQty, nakedLongQty, nakedShortQty, exchangeId, productCode, productType, periodCode, strike
- JSON: businessDt, currency, customerAccountType, omnibusIndicator, clearingOrganizationId, firmId, accountId, originType, netQty, nakedLong, nakedShort, exchangeId, productCode, productType, periodCode
Error Type | Error Code | Error Message | Portfolio Type |
---|---|---|---|
NO_PAYLOAD | ERR001 | "Payload can't be null" | JSON |
NO_TRADES_POSITIONS | ERR002 | "No trades and/or positions provided" | JSON |
INVALID_INPUTTYPE | ERR003 | "Unsupported portfolio type supplied in inputType parameter: %s. Must be one of: %s" | CSV |
INVALID_CSV_FORMAT | ERR004 | "Issue whilst converting from csv format. See logs for more details" | CSV |
MARGIN_CALCULATION_ERROR | ERR005 | "Unable to process margin request, error during calculations: %s. Check logs for more information" | CSV |
MARGIN_AGGREGATION_ERROR | ERR006 | "%s" | N/A |
NO_RPF_SUPPLIED | ERR007 | "Unable to create the Analytics Service Implementation. No risk parameter file provided, or directory does not exist" | N/A |
INVALID_RPF | ERR008 | "Unable to properly initialize the risk analytics service. %s risk parameter file invalid: %s" | CSV JSON |
INVALID_POINT_IN_TIME | ERR009 | Invalid Error | N/A |
NO_RPF_FOR_POINT_IN_TIME | ERR010 | "Portfolio %s of type %s cannot be margined because the supplied point in time %s does not match any loaded risk parameter file" | CSV JSON |
RPF_INCORRECT_FORMAT | ERR011 | "Unable to determine cycle dates from %s – risk parameter file missing or in the wrong format. Check the | N/A |
INVALID_PORTFOLIO | ERR012 | "Position was ignored: internal SPAN exception while creating position" | CSV |
INVALID_POSITION | ERR013 | "Invalid position at line $d" General error used for all semantic issues. | CSV JSON |
INVALID_SPAN_POSITION | ERR014 | "Invalid position at line $d" | CSV |
NO_MODULE_CONFIG_FOR_RPF | ERR015 | "Unable to find configuration for provided risk parameter file: %" | N/A |
CANNOT_BE_ENCODED | ERR016 | "Unable to encode value" | N/A |
INVALID_TRADE | ERR017 | "Invalid trade at line $d" | CSV JSON |
NO_POINT_IN_TIME_SUPPLIED | ERR018 | "No point in time supplied" | JSON |
INVALID_TRANSACTION | ERR019 | "Invalid transaction at line $d" | N/A |
CANNOT_CONFIGURE_DIFFERENT_DATES | ERR020 | "Issue when attempting to load different versions of risk library for different dates" | N/A |
CANNOT_FIND_INSTRUMENT | ERR021 | "Instrument does not exist" | N/A |
UNABLE_TO_LOAD_CACHE | ERR022 | "Unable to load cache %s" | N/A |
FAILED_CREATE_RISK_DATA_READERS | ERR023 | "Failed to create risk data readers for file in %s" | N/A |
FAILED_TO_REFRESH_RPF | ERR024 | "Failed to refresh/reload risk parameter file" | N/A |
ERROR_IN_RL_MARGIN_CALCULATION | ERR025 | "Error calculating margin for position/s, please contact CME" | N/A |
INVALID_PORTFOLIO_CUSTOMER_ACCOUNT_TYPE | ERR026 | "Invalid portfolio customer account type" | CSV |
INVALID OMNIBUS | ERR100 | "Invalid | CSV JSON |
N/A | N/A | WARNING: Current configuration in RPF not compatible with deployable version. Overriding for backward compatibility. | N/A |
Lower-Level Warnings and INFO Messages
Message Type | Message Code | Message Details | Portfolio Type |
---|---|---|---|
DUPLICATE_OMNIBUS_PARENT_ACCOUNT | WRN028 | Duplicate omnibus parent account, aggregation may occur to this account | CSV JSON |
OMNIBUS_ACCOUNT_WITH_NO_POSITIONS_AND_NO_CHILDREN | WRN029 | Omnibus account has no positions and no child accounts - margin response not returned | CSV JSON |
OMNIBUS_PARENT_ACCOUNT_NOT_FOUND_FOR_CHILD_ACCOUNT | INF030 | No omnibus parent account found for child account | CSV JSON |
OMNIBUS_PARENT_ACCOUNT_HAS_NO_CHILDREN | INF031 | No child accounts found for omnibus parent account | CSV JSON |
FAILED_TO_RETRIEVE_SPAN2_DATA_FROM_RPF | WRN032 | Missing SPAN 2 product data in RPF. Check logs for more information | CSV JSON |
- Anything with a '%' in front of it gets injected with the actual details depending on the error.
- Error code "ERR013" is also used for any semantic futures and options position errors.
- Error codes are constructed in a couple of different ways (see the parsing sketch below):
- One way is the number of the field in error + "_" + the line number in the CSV in error. For example, error code ERR116_2 indicates ERR116, an invalid origin type field, on line 2 of the CSV input file.
- Another way is with a counter attached. For example, if a portfolio of 1,000 positions had a semantic validation error on each line, the error codes could range from ERR017_1 to ERR017_1000.
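A small illustration of the numbering convention: the portion after the underscore is the CSV line number (or occurrence counter), and the portion before it is the base error code.

String code = "ERR116_2";
String[] parts = code.split("_");
String baseCode = parts[0];                      // "ERR116" - invalid origin type field
int lineOrCounter = Integer.parseInt(parts[1]);  // 2 - line 2 of the CSV input (or the occurrence counter)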
Error codes coming out of the converter:
Error Type | Error Code | Error Message | Portfolio Type |
---|---|---|---|
REQUEST_ID | ERR101 | Invalid error | N/A |
VERSION | ERR102 | Invalid error | N/A |
SENT_TIME | ERR103 | Invalid error | N/A |
BUSINESS_DATE | ERR104 | "No point in time supplied" | JSON |
CYCLE_CODE | ERR105 | Invalid error | N/A |
RUN_NUMBER | ERR106 | Invalid error | N/A |
PORTFOLIO_ID | ERR107 | Invalid error | N/A |
CURRENCY | ERR108 | "Invalid value null in column currency: %s" | CSV |
CUSTOMER_ACCOUNT_TYPE | ERR109 | "Invalid value null in column customerAccountType: %s" | CSV |
OMNIBUS_INDICATOR | ERR110 | "Invalid netQty/nakedShortQty/nakedLongQty supplied for parentPORTFOLIO/omnibusIndicator combination" | CSV |
PARENT_PORTFOLIO_ID | ERR111 | Invalid error | N/A |
CLEARING_ORGANIZATION_ID | ERR013 | "Failed validation: No clearing organization id supplied: %s" | CSV JSON |
FIRM_ID | ERR013 | "Failed validation: No firm id supplied: %s" | JSON |
ACCOUNT_ID | ERR114 | "Failed validation: No account id supplied: %s" | CSV JSON |
ACCOUNT_NAME | ERR013 | Invalid error | N/A |
ORIGIN_TYPE | ERR116 ERR013 | "Invalid value null in column OrginType: %s" "Failed validation: Orgin type has to be one of CUST, CUSTOMER, HOUS, HOUSE: %s" | CSV JSON |
ACCOUNT_TYPE | ERR117 | Invalid error | N/A |
SEGREGATION_TYPE | ERR118 | Invalid Error | N/A |
NET_QUANTITY | ERR013 | "Failed validation: Final net quantity must not be 0 for non-omnibus accounts: %s" | CSV JSON |
NAKED_LONG_QUANTITY | ERR100 ERR013 | "Invalid netQty/nakedShortQty/nakedLongQty supplied for parentPORTFOLIO/omnibusIndicator combination" "Failed validation: Final net quantity must not be 0 for non-omnibus accounts: %s" | CSV JSON |
NAKED_SHORT_QUANTITY | ERR100 ERR013 | "Invalid netQty/nakedShortQty/nakedLongQty supplied for parentPORTFOLIO/omnibusIndicator combination" "Failed validation: Final net quantity must not be 0 for non-omnibus accounts: %s" | CSV JSON |
EXCHANGE_ID | ERR013 | "Failed validation: No valid exchange id supplied: %s" | CSV JSON |
PRODUCT_CODE | ERR014 | "Position was ignored: internal SPAN exception while creating position: %s" | CSV JSON |
PRODUCT_TYPE | ERR013 | "Failed validation: No valid product type supplied: %s" | CSV JSON |
PERIOD_CODE | ERR013 | "Failed validation: No period code supplied: %s" | CSV JSON |
STRIKE | ERR013 | "Failed validation: No strike supplied: %s" | CSV |
UNDERLYING_PERIOD_CODE | ERR131 | Invalid Error | N/A |
MEMO | ERR132 | Invalid Error | N/A |
Additional error message text which can be found after the ERROR_MESSAGE string:
- Failed validation: No point in time supplied
- Failed validation: Cycle code must be one of [ITD, EOD]
- Failed validation: No valid currency supplied
- Failed validation: No account id supplied
- Failed validation: Origin type must be one of [CUST, CUSTOMER, HOUS, HOUSE]
- Failed validation: No portfolio level customer account type supplied
- Failed validation: No firm id supplied
- Failed validation: Omnibus Account Type cannot be MEMBER
- Failed validation: No valid product type supplied
- Failed validation: No clearing organization id supplied
- Failed validation: No period code supplied
- Failed validation: No product code supplied
- Failed validation: No valid exchange id supplied
- Failed validation: No valid exchange id supplied: must be one of CBT,CME,CMX,COMEX,NYM,NYMEX
- Failed validation: No strike supplied
- Failed validation: Missing Series, could not populate underlying period code
- Failed validation: No PUT or CALL option type supplied
- Failed validation: Strike price should not be supplied for this instrument
- Failed validation: Option type should not be supplied for this instrument
- Failed validation: Underlying period code should not be supplied for this instrument
- Failed validation: Naked short quantity is not allowed for non-omnibus accounts
- Failed validation: Naked long quantity is not allowed for non-omnibus accounts
- Failed validation: Final Net must be non-null
- Failed validation: Final net quantity must not be 0 for non-omnibus accounts
- Failed validation: Final Net quantity is not allowed for an omnibus account
- Failed validation: Naked short quantity must be non-negative integer
- Failed validation: Naked long quantity must be non-negative integer
- Failed validation: Naked long and naked short must not both be 0
Release Examples
The below examples are intended to assist with new features and concepts introduced in deployable software, not to provide an exhaustive list of all features. For technical details, please review the "examples" provided in the software download package.
List of Start-up Arguments
Listed in alphabetical order.
Start-up Argument | Data Type | Required? | Description | Default |
---|---|---|---|---|
-DadditionalSpanFiles | string | N | User-supplied directory of additional SPAN files (see "Freezing a Point in Time and Using Add Market Data" above). | N/A |
-DcacheMode | string | N | If "redis" informs deployable to run with redis data cache; else ignored. | N/A |
-DdontIgnoreSpanErrors | boolean | N | true = does not ignore errors from the SPAN library, fails to margin when errors from SPAN library occur. | false |
-DfilterByOrigin | string | N | HOUS = margins only portfolios where OriginType = House | N/A |
-DfnoRpfDirectory | string | N | User-supplied risk parameter file directory under the supplied root directory (see -Drpf.source.path). | N/A |
-DignoreInvalidPositions | boolean | N | true = ignores invalid positions in the request and passes only valid positions to margin. | true |
-Dlog4j.configurationFile | string | N | User-supplied log4j configuration file, describing log level. | N/A |
-DomnibusPortfolioValidationDisabled | boolean | N | true = omnibus warning messages are not exposed in the margin response. | false |
-DriskLibraryDump | boolean | N | true = allows extra logging via arcDlApplication/data folder | false |
-Drpf.source.path | string | Y | User-supplied risk parameter file root directory. | ./rpfs |
-Dserver.port | number | Y | User-supplied server port. | 8082 |
-DspanFilesDirectoryOverride | string | N | User-supplied legacy SPAN file location (see Risk Parameter Files above). | N/A |
Initialization of RiskAnalyticsService
RiskAnalyticsServiceFactory.createRiskAnalyticsService(…) takes an object of type RiskConfig.class. This class has a builder internal class. For the purposes of the following examples let's assume a file structure like the below:
/Users
  /someUser
    /rpfDirectory
      /20190527_FNO
      /20190528_FNO
      /20190529_FNO
  /otherFolder
  /someOtherFolder
//If passing just the rpfDirectory, it will try to pick the latest RPF available under that folder. In this case the RPF under 20190529_FNO will be selected.
Example:
RiskConfig riskConfig = new Builder()
    .rfpDirectory("/Users/someUser/rpfDirectory")
    .build();
RiskAnalyticsService riskAnalyticsService = RiskAnalyticsServiceFactory.createRiskAnalyticsService(riskConfig);
//If passing the rpfDirectory plus the fnoRpfDirectory, it will select the specific folder inside the rpfDirectory. The folder 20190528_FNO will be selected.
Example:
RiskConfig riskConfig = new Builder()
    .rfpDirectory("/Users/someUser/rpfDirectory")
    .fnoRfpDirectory("20190528_FNO")
    .build();
RiskAnalyticsService riskAnalyticsService = RiskAnalyticsServiceFactory.createRiskAnalyticsService(riskConfig);
//validationDisabled (default: false) -> if true, completely disables the semantic validation. This flag does not affect the converter syntax validation.
Example:
RiskConfig riskConfig = new Builder()
    .rfpDirectory("/Users/someUser/rpfDirectory")
    .validationDisabled(true)
    .build();
RiskAnalyticsService riskAnalyticsService = RiskAnalyticsServiceFactory.createRiskAnalyticsService(riskConfig);
//ignoreInvalidPositions (default: false) -> if true, ignores invalid trades picked up by semantic validation and margins the good ones. This is the validation before the trades are sent to be margined. Syntactically invalid positions picked up by the converter are handled using the converter API described below in the section 'Using RiskFnoToRiskConverter'.
Example:
RiskConfig riskConfig = new Builder()
    .rfpDirectory("/Users/someUser/rpfDirectory")
    .ignoreInvalidPositions(true)
    .build();
RiskAnalyticsService riskAnalyticsService = RiskAnalyticsServiceFactory.createRiskAnalyticsService(riskConfig);
//dontIgnoreSpanErrors (default: false) -> if true, it will throw an exception when errors are returned from the margin calculation. When false, the errors will simply be added to the response.
Example:
RiskConfig riskConfig = new Builder()
    .rfpDirectory("/Users/someUser/rpfDirectory")
    .dontIgnoreSpanErrors(true)
    .build();
RiskAnalyticsService riskAnalyticsService = RiskAnalyticsServiceFactory.createRiskAnalyticsService(riskConfig);
//riskLibraryDump (default: false) -> if true, it will add extra logging from inside the native calculation.
Example:
RiskConfig riskConfig = new Builder()
    .rfpDirectory("/Users/someUser/rpfDirectory")
    .riskLibraryDump(true)
    .build();
RiskAnalyticsService riskAnalyticsService = RiskAnalyticsServiceFactory.createRiskAnalyticsService(riskConfig);
//spanFilesDirectoryOverride - will override the marketdata folder SPAN files that are part of the standard SPAN/SPAN 2 RPF file installation. When margining historical SPAN-only files, the override folder must be used along with an existing/installed SPAN RPF (cme.span.yyyymmdd.[s.c].zip). This folder can also be used to add in cross-margin and/or inter-exchange SPAN files (e.g. MGE, OCC).
Example:
RiskConfig riskConfig = new Builder()
    .rfpDirectory("/Users/someUser/rpfDirectory")
    .spanFilesDirectoryOverride("Some/override/directory")
    .build();
RiskAnalyticsService riskAnalyticsService = RiskAnalyticsServiceFactory.createRiskAnalyticsService(riskConfig);
Using RiskFnoToRiskConverter
//With errors being returned as a list. It will still return a RiskPortfolioRequestMessage.
Example:
//With an exception being thrown when an error is found. An exception of type RiskAnalyticException is thrown with information about the error. The flow is interrupted.
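A hedged sketch of the two behaviours described above. The factory and method names (createRiskFnoToRiskConverter, convert, getErrors) are hypothetical; use the converter factories shipped with the SDK, as recommended in 'Accessing the API: Best Practices'.

String csvPayload = "..."; // raw CSV portfolio content
RiskFnoToRiskConverter converter = ConverterFactory.createRiskFnoToRiskConverter(); // hypothetical factory call
try {
    // Depending on configuration, syntactic errors are either collected alongside the message...
    RiskPortfolioRequestMessage message = converter.convert(csvPayload); // hypothetical method name
    converter.getErrors().forEach(System.out::println);                  // hypothetical accessor
} catch (RiskAnalyticException e) {
    // ...or thrown as a RiskAnalyticException, interrupting the flow.
    System.err.println(e.getMessage());
}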
Log File Parameter
The deployable software uses SLF4J, which allows users to plug in their desired logging framework. To help users understand the logging configuration, there is an example log4j XML file supplied in the examples/arcdl-example directory under src/main/resources. This can be changed to whatever logging level is required and packaged into the user's tech stack with the deployable jar by changing the "Root level=" field per the list of log levels below.
A second option is to supply a separate log XML file, which can then be loaded into the deployable software:
- Include a log4j2.xml configuration file within the packaged JAR (this file is where the output file properties, such as file name and logging pattern, can be modified).
- Supply the parameter at runtime (see "Reducing the Size of the Log File" below).
Log Levels
Level | Description | Default? | Perf Test Log File Example* |
---|---|---|---|
DEBUG | Highest level logging, includes details on calculation engine order of operations. | N | 14,700 KB |
INFO | Standard logging, some calculation details and all errors | Y | 8,400 KB |
WARN | Errors only. | N | 6,200 KB |
ERROR | Very limited logging produced | N | 1 KB |
OFF | No logging is produced | N | N/A |
*Perf test conditions:
- Initialize SPAN-only risk parameter file (2/2/22 point in time) + MGE 2/2/22 EOD pa2 file
- Load portfolioA – 13,500 positions with errors, no margin expected
- Load portfolioB – 12,700 positions, multiple errors, margin expected
Reducing the Size of the Log File
To enable debug mode on the logging levels, you will need to download the example log4j file described above.
In this file, find <Root level="INFO"> (at the bottom of the file); you can change that to DEBUG.
- Then place this file in the same location as the arcdl-web-example.jar.
- Then in your start script, before the jar name, use this parameter: -Dlog4j.configurationFile=log4j.xml
Performance Metrics
Baseline Test Case Description:
- 1.5m positions across 15k accounts, 100 positions per account on average
- Testing performed using CME-produced SPAN and SPAN 2 RPF files, not using legacy SPAN files via SPAN Over-ride feature.
- Optimized Hardware:
- 8 cores, 10 worker threads
- 64gb RAM
- Total Max memory usage: 31 GB
- A Redis process is running locally on our test boxes that is used for caching; this could be run remotely on a separate box but will introduce some latency.
- Different memory usage metrics:
- JVM memory is the memory being used by the Java Virtual Machine when running the app
- Redis memory is the memory being used by the Redis cache
- Total memory is the entire amount of memory being used by the box. This includes the JVM memory, memory used by Redis cache as well as the OS.
- Performance Test Case Results (v2.0):
Request Type | Total Time (mm:ss) | Time per portfolio | Portfolios per second | Max JVM memory usage (max vm mem) | Redis memory usage | Max total memory usage |
---|---|---|---|---|---|---|
SPAN | 5:20 | 21.3ms | 46.8 | 12 GB | n/a | 25 GB |
SPAN 2 | 8:19 | 33.3ms | 30 | 12.7 GB | 10.8 GB | 32 GB |
Threads | SPAN (hh:mm:ss) | SPAN 2 (hh:mm:ss) |
---|---|---|
1 | 00:42:00 | 01:09:30 |
10 | 00:05:20 | 00:08:30 |