Thursday, July 4, 2019
Exploring Optimal Levels of Data Filtering
It is commonplace to filter sensitive financial data by removing false observations or outliers before conducting any analysis on it. In fact, it is very often one of the first steps taken in empirical financial research to improve the quality of raw data and reduce false conclusions. However, filtering financial data can be quite tricky, not just because of the reliability of the plethora of data sources, the complexity of the merged data and the many different statistical properties of the variables, but most significantly because of the reason behind the existence of each extreme value in the data. Some outliers may be caused by extreme events which have an economic reason, like a merger, takeover bid or global financial crisis, rather than by a data error. Under-filtering can lead to the inclusion of false observations (data errors) caused by technical faults (e.g. computer system failure) or human error (e.g. an unintentional mistake like a typing error, or an intentional one like producing fake quotations for testing). Likewise, over-filtering can also lead to wrong conclusions by deleting outliers driven by extreme events which are important to the analysis. Thus, the question of the correct amount of filtering of financial data, albeit subjective, is quite important for improving the conclusions drawn from empirical, data-based research.
In an attempt to reasonably answer this question, this seminar paper aims to look for the optimal level of data filtering. The analysis conducted in this paper was performed on the Xetra intraday data provided by the University of Mannheim. This time-stamped data for the entire Xetra universe had been collected from the Deutsche Börse Group. The data consisted of the historical CDAX components that had been compiled from Datastream, Bloomberg and CDAX. Bloomberg's corporate actions calendar had been used to track dates of IPO listing, delisting and ISIN changes of companies. Corporations not covered by Bloomberg had been tracked manually. Even though a few basic filters had already been applied (e.g. dropping negative observations for spread/depth/volume), some of which were replicated from the Market Microstructure Database File, the data remained largely raw. The variables in the data had been computed for each day and the data aggregated to a daily frequency. The whole analysis was conducted using the statistical software STATA.
The following variables were taken into consideration for the purpose of identifying outliers, as generally done in empirical research:

Depth = depth_trade_value
Trading volume = trade_vol_sum
Quoted bid-ask spread = quoted_trade_value
Effective bid-ask spread = effective_trade_value
Closing quote midpoint returns, which were calculated following Hussain (2011): r_t = 100 * (log(P_t) - log(P_t-1))

Hence, closing_quote_mid_rlg = 100 * [log(closing_quote_mid(n)) - log(closing_quote_mid(n-1))], where closing_quote_mid = (closing_ask_price + closing_bid_price)/2.

Our sample consisted of the first 1,595 observations, out of which 200 observations were outliers. Only these first 200 outliers were analyzed (chronologically, on a stock basis) and classified as either data errors or extreme events. These outliers were associated with two companies: 313music JWP AG and 3U Holding AG. Alternatively, a different approach could have been used to select the sample so as to include more companies, but the basic principle of how filters behave should be independent of the sample selected if the filter is to be free of any biases; for instance, if a filter is effective, it should perform relatively well on any stock or sample. It should be noted that we did not include every listed company in our sample, as covering all stocks is beyond the scope of this paper.
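The return construction above can be sketched in a few lines. The paper's own analysis was done in STATA, so the following Python snippet is only an illustration of the formula, and the prices used are invented:

```python
import math

def midpoint_log_returns(ask_prices, bid_prices):
    """Closing quote midpoint log returns in percent, per Hussain (2011):
    r_t = 100 * (log(P_t) - log(P_t-1)), with P the quote midpoint.
    Inputs must be sorted chronologically for a single stock."""
    mids = [(a + b) / 2 for a, b in zip(ask_prices, bid_prices)]
    # The first observation has no predecessor, so its return is undefined (None).
    return [None] + [100 * (math.log(p) - math.log(q))
                     for q, p in zip(mids, mids[1:])]

# Invented prices: midpoints are 10.0, 10.2, 10.1
returns = midpoint_log_returns([10.1, 10.3, 10.2], [9.9, 10.1, 10.0])
# the second entry is 100 * (log(10.2) - log(10.0)), roughly 1.98
```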
Moreover, since we selected the sample chronologically on a stock basis, we were able to analyze the effect of these filters even on the non-outlier observations in the sample, which we believe is an important point to examine when deciding on the optimal level of filtering. Our necessarily somewhat arbitrary definition of an outlier was: any observation lying outside the 1st and the 99th percentile of each variable on a stock basis. The idea behind this was to treat only the most extreme values for each variable of interest as outliers. The reason why the outliers were identified on a per-stock basis rather than on the whole data was that the data consisted of many different stocks with greatly varying levels of each variable of interest; for example, the 99th percentile of volume for one stock might be 70 thousand trades, while that of another might be 350 thousand trades, so an observation with 80 thousand trades might be extreme for the first stock but completely normal for the second one. Hence, if we identified outliers (outside the 1st and the 99th percentile) for each variable of interest on the whole data, we would be ignoring the unique properties of each stock, which might result in under- or over-filtering depending on the properties of the stock in question. An outlier could either be the result of a data error or an extreme event.
A data error was defined using the Dacorogna (2008) definition: "An outlier that does not conform to the real condition of the market." The 94 observations in the selected sample with missing values for any of the variables of interest were also classified as data errors. Alternatively, we could have handled the missing values at the outset by dropping them from the analysis, but the reason they were included in this paper is that, if they exist in the data sample, the researcher has to deal with them by deciding whether to treat them as data errors, which are to be removed through filters, or to change them, e.g. to the preceding value; hence it might be of value to see how the various filters interact with them. An extreme event was defined as: "An outlier backed by economic, social or legal reasons such as a merger, global financial crisis, share buyback, major lawsuit etc." The outliers were identified, classified and analyzed in this paper using the following process. Firstly, the intraday data was grouped on a stock-date basis. Observations without an identifier were dropped. This was followed by creating variables for the 1st and 99th percentile values of each stock's closing quote midpoint returns, depth, trading volume, and quoted and effective bid-ask spreads, and afterwards dummy variables for outliers. Secondly, after taking the company name and month of the first 200 outliers, and keeping in consideration a search window of about one week, it was checked on Google whether these outliers were likely caused by extreme events or were the result of data errors, and they were classified accordingly using a dummy variable.
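The per-stock percentile flagging described above can be sketched as follows. This is a Python illustration, not the paper's STATA code: the linear-interpolation percentile is one of several conventions (STATA's default differs slightly) and the toy panel is invented.

```python
def percentile(values, p):
    """Linear-interpolation percentile, 0 <= p <= 100 (one common convention)."""
    xs = sorted(values)
    k = (len(xs) - 1) * p / 100
    lo, hi = int(k), min(int(k) + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

def flag_outliers(panel):
    """panel: dict mapping stock -> list of daily values for one variable.
    Returns dict mapping stock -> list of 0/1 outlier dummies, where an
    outlier lies outside the stock's own 1st to 99th percentile range."""
    flags = {}
    for stock, vals in panel.items():
        p1, p99 = percentile(vals, 1), percentile(vals, 99)
        flags[stock] = [1 if v < p1 or v > p99 else 0 for v in vals]
    return flags

# Per-stock flagging: a value of 500 is extreme for stock A but normal for B.
panel = {"A": [10, 12, 11, 9, 500] + [10] * 95,
         "B": [400, 500, 450, 480, 500] + [450] * 95}
flags = flag_outliers(panel)
```

Flagging per stock rather than on the pooled data is what lets the same absolute value be an outlier for one stock and unremarkable for another, as the text argues.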
Thirdly, different filters which are used in the financial literature for cleaning data before analysis were applied one by one in the next section, and a comparison was made of how well each filter performed, i.e. how many probable data errors were filtered out as opposed to outliers likely caused by extreme events. These filters were chosen on the basis of how commonly they are used for cleaning financial data, and some of the most popular ones were selected.

4.1. Rule of Thumb
One of the most widely used methods of filtering is to use some rule of thumb to eliminate observations that are too extreme to possibly be accurate. Many studies use different rules of thumb, some more arbitrary than others. Some of these rules were taken from well-known papers on market microstructure and their impact on the outliers was tested.

4.1.1. Quoted and Effective Spread Filter
In the paper Market Liquidity and Trading Activity, Chordia et al. (2000) filter out data by looking at effective and quoted spreads to eliminate observations that they believe are caused by key-punching errors. This method involved dropping observations with:

Quoted spread > 5
Effective spread / Quoted spread > 4.0
% Effective spread / % Quoted spread > 4.0
Quoted spread / Transaction price > 0.4

Using the above filters resulted in the identification and subsequent dropping of 61.5% of observations classified as probable data errors, whereas none of the observations classified as probable extreme events were filtered out.
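The four conditions above can be sketched as a single predicate. This is a Python illustration (the paper used STATA); the argument names and the example values are invented, with units following the thresholds as listed:

```python
def crs_spread_error(quoted, effective, pct_quoted, pct_effective, price):
    """True if an observation trips any of the Chordia, Roll and
    Subrahmanyam (2000) rule-of-thumb conditions for likely
    key-punching errors."""
    return (quoted > 5.0
            or (quoted > 0 and effective / quoted > 4.0)
            or (pct_quoted > 0 and pct_effective / pct_quoted > 4.0)
            or (price > 0 and quoted / price > 0.4))

# A plausible quote passes; an absurd quoted spread of 6 is flagged.
assert crs_spread_error(0.05, 0.04, 0.002, 0.0015, 25.0) is False
assert crs_spread_error(6.0, 0.04, 0.002, 0.0015, 25.0) is True
```

Note that each condition checks whether an individual observation makes economic sense, rather than where it sits in a statistical distribution, which is the property the text credits for this filter's good performance.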
Thus, these spread filters look very promising, as a reasonably large portion of probable data errors was removed while none of the probable extreme events were dropped. The reason these filters produced good results is that they look at the individual values of quoted and effective spreads and remove the ones that do not make sense logically, rather than simply removing values from the tail ends of the statistical distribution of each variable. It should be noted that these filters removed all of the 94 missing values, which means that only five further data errors were detected in addition to the detection of all the missing values. If we were to drop all the missing-value observations before applying this method, it would have helped filter out only 7.5% of probable data errors, while still not dropping any probable extreme events. Thus, this rule yields good results and should be included in the data cleaning process. Perhaps using this filter in combination with a logical threshold filter for depth, trading volume and returns might yield the best results.

4.1.2. Extreme Returns Filter
Researchers are also known to drop extreme returns above a certain threshold in the process of data cleaning. This threshold is subjective, depending on the distribution of returns, and varies from one study to another: for example, HS used a 10% threshold, Chung et al. 25% and Bessembinder 50%. In the case of this paper, we decided to drop absolute closing quote midpoint log returns of 20% or more. Perhaps a look at the time series of returns of 313music JWP AG and 3U Holding AG can explain why this particular threshold was chosen (Figure 1).
Figure 1. Scatter plot of closing quote midpoint returns over time

As seen in the graph, most of the observations for returns lie between -20% and 20%. However, applying this filter did not yield the best results, as only 2.5% of probable data errors were filtered out, as opposed to 10.3% of probable extreme events from our sample. Therefore, this filter applied in isolation doesn't really seem to add much value. Perhaps an improvement could be achieved by only dropping returns which are extreme but reversed within the next few days, as this is indicative of a data error. For example, if the T1 return = 5%, the T2 return = 21% and the T3 return = 7%, we can say that in T3 returns were reversed, indicating that the T2 return might have been the result of a data error. This filter was applied by only dropping return values of 20% or more (in absolute terms) which, in the next day or two, reverted back to within +/- 3% of the return of the day before the outlier occurred, as shown below:

|r(n)| >= 20%
|r(n-1) - r(n+1)| <= 3% or |r(n-1) - r(n+2)| <= 3%

where r(n) is the closing quote midpoint return on any given day. This additional condition seemed to work, as it prevented the filtering out of any probable extreme events. However, the portion of filtered data errors from our sample dropped from 2.5% to 1.9%. In conclusion, it makes sense to use this improved return filter, which accounts for reversals, in conjunction with other filters, e.g. the spread filter. Perhaps this method can be further improved by using a somewhat more objective threshold for ascertaining price reversals, or an improved algorithm for identifying return reversals.

4.1.3. Price Filter
We constructed a price filter inspired by the Brownlees & Gallo (2006) approach.
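Before moving on, the reversal-adjusted return filter of Section 4.1.2 can be sketched as follows. This is a Python illustration only (the paper's analysis was in STATA) and the example return series are invented:

```python
def reversal_filter(returns, threshold=20.0, tol=3.0):
    """Flag returns (in percent) that are extreme AND reversed:
    |r(n)| >= threshold, and within the next day or two the return comes
    back to within +/- tol of the pre-outlier day's return r(n-1)."""
    flagged = [False] * len(returns)
    for n in range(1, len(returns) - 1):
        if abs(returns[n]) < threshold:
            continue
        prev = returns[n - 1]
        lookahead = returns[n + 1:n + 3]  # the next one or two days
        if any(abs(r - prev) <= tol for r in lookahead):
            flagged[n] = True
    return flagged

# The 21% spike reverts to near the pre-spike level, so it is flagged;
# an extreme move that persists is left alone.
flags = reversal_filter([5.0, 21.0, 7.0, 6.0])
# flags == [False, True, False, False]
```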
The idea behind this filter is to judge the validity of every transaction price based on its proportional distance to the neighboring prices. An outlier was identified using the following rule:

|p_i - mu| >= 3*sigma

where p_i is the log of the daily transaction price (the reason a logarithmic transformation was applied is that the standard deviation method assumes a normal distribution), mu is the stock-grouped mean and sigma is the stock-grouped standard deviation of log daily prices. The reason we chose the stock-grouped mean and standard deviation is that the range of prices in our data set varied greatly from one stock to another; hence, it made sense to use each stock's individual price mean as an estimate of the neighboring prices. This resulted in filtering 56.5% of probable data errors, which were all missing values. Thus, this filter doesn't seem to hold any real value when used in conjunction with a missing-value filter. Perhaps using an improved algorithm for identifying the mean price of the closest neighbors might yield optimal results.

4.2. Winsorization and Trimming
A very popular filtering method in the financial literature is trimming or winsorization. According to Green & Martin (2015a), p. 8, if we want to winsorize the variables of interest at a given percentage, we replace the n largest values by the upper quantile of the data and the n smallest values by the lower quantile of the data; whereas, if we want to trim the variables of interest by that percentage, we simply drop observations outside the corresponding quantile range. Thus, winsorization limits extreme observations rather than dropping them entirely like trimming. For the purpose of this paper, both methods have similar impacts on the outliers outside a given quantile range; hence, we will only analyze winsorization in detail.
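The contrast between the two definitions can be sketched as follows (a Python illustration with an invented series; the quantile convention is one of several and differs slightly from STATA's):

```python
def _quantile(xs, q):
    """Linear-interpolation quantile, q in [0, 1] (one common convention)."""
    xs = sorted(xs)
    k = (len(xs) - 1) * q
    lo, hi = int(k), min(int(k) + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

def winsorize(values, alpha=0.01):
    """Limit (not drop) values outside the [alpha, 1 - alpha] quantile range."""
    lo, hi = _quantile(values, alpha), _quantile(values, 1 - alpha)
    return [min(max(v, lo), hi) for v in values]

def trim(values, alpha=0.01):
    """Drop values outside the [alpha, 1 - alpha] quantile range."""
    lo, hi = _quantile(values, alpha), _quantile(values, 1 - alpha)
    return [v for v in values if lo <= v <= hi]

data = list(range(1, 100)) + [1000]  # one wild value at the top
w, t = winsorize(data, 0.01), trim(data, 0.01)
# winsorizing keeps all 100 observations but caps the extremes;
# trimming removes the extreme observations outright
```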
However, winsorization introduces an artificial structure to the dataset, because instead of dropping outliers it changes them; therefore, if this research were taken a step further, e.g. to conduct robust regressions, choosing one method over the other would depend largely on the kind of research being conducted. The question of how much to winsorize the variables is entirely arbitrary; however, it is common practice in empirical finance to winsorize each tail of the distribution at 1% or 0.5%. We first winsorized the variables of interest at the 1% level, on a stock basis, which led to limiting 100% of probable extreme events and only 42.9% of probable data errors. Intuitively, one would expect all the identified outliers to be limited, because the method used for identifying outliers for each variable considered observations either greater than the 99th percentile or less than the 1st percentile, and winsorizing the data at the same level should then affect all of the outliers. However, this inconsistency between expectation and outcome results from the existence of missing values: winsorization only limits the extreme values in the data, disregarding the missing observations, which have been counted among the data errors. We then winsorized the variables of interest at a more stringent level, i.e. 0.5%, on a stock basis, which led to 51.3% of the identified data errors and 18.6% of probable extreme events being limited. This doesn't exactly seem ideal, as in addition to data errors, quite a large portion of the identified extreme events was also filtered out.
Taking this analysis a step further, the variables of interest were also winsorized on the whole data (which is also commonly done), as opposed to on a per-stock basis, at the 0.5% and 1% levels. Winsorizing at the 1% level led to limiting 51% of the extreme events, 24.2% of the data errors, and an additional 134 observations in the sample not identified as outliers. This points toward over-filtering. Doing it at the 0.5% level led to limiting 28% of the extreme events, 12.4% of the data errors, and an additional seven observations in the sample not identified as outliers. Thus, it seems that no matter which level (1% or 0.5%) we winsorize at, and whether we do it on a per-stock basis or on the whole data, a large portion of probable extreme events is filtered out. Of course, our definition of an outlier should also be taken into consideration when analyzing this filter. Winsorizing on a per-stock basis does not yield very meaningful results, as it coincides with our outlier definition. However, doing it on the whole data should not clash with this definition, since we identified outliers outside the 1st and the 99th percentile of each variable on a per-stock basis rather than on the data as a whole. Regardless, this filter doesn't yield optimal results, as a substantial chunk of probable extreme events gets filtered out. This is because the technique doesn't set boundaries for the variables logically, like the rule of thumb method does; instead it inherently assumes that all outliers outside a pre-defined percentile must be evened out, and outliers caused by extreme events don't necessarily lie within the defined boundary. It must also be noted that the winsorization filter does not limit missing values, which are also classified as data errors in this paper.
Thus, our analysis indicates that this filter might be of little use if we are interested in retaining the maximum number of probable extreme events. Perhaps using it together with an additional filter for limiting missing values might yield a better outcome, if the researcher is willing to sacrifice probable extreme events for the sake of dropping probable data errors.

4.3. Standard Deviation & Logarithmic Transformation
Many financial papers also use a filter based on x times the standard deviation:

x_i >= mu + x*sigma or x_i <= mu - x*sigma

where x_i is any given observation of the variable of interest, mu is the variable's mean and sigma is the variable's standard deviation. An example would be Goodhart and Figliuoli (1991), who used a filter based on four times the standard deviation. However, this method assumes a normal distribution, so problems might arise with distributions that are not normal; and in our data set, except for returns (because we calculated them using logs), the distributions of depth, trading volume, and the effective and quoted bid-ask spreads are not normally distributed. Therefore, we first log transformed the latter four distributions using

y = log(x)

where y is the log transformed variable and x is the original variable. The before and after graphs of the log transformation are shown in Figure 3. We then dropped observations for all the log transformed variables greater than mean + x*standard deviation or less than mean - x*standard deviation, first on a stock basis and then on the whole data, for values of x = 4 and x = 6. Applying this filter at the x = 6 level on a stock basis seemed to yield better results than applying it at the x = 4 level. This is because x = 6 led to dropping 25.6% fewer probable extreme events at the cost of only a 3.1% fall in dropped probable data errors. The outcomes are shown in Table 3.
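The log transformation and the x-standard-deviation cutoff can be combined into one sketch. Again this is a Python illustration, not the paper's STATA code, and the volume series below is invented:

```python
import math

def stdev_filter(values, x=6.0, log_transform=True):
    """Keep observations within mean +/- x * standard deviation, optionally
    after a log transform (used here for positive, skewed variables such as
    volume, depth and spreads, as in the text)."""
    vals = [math.log(v) for v in values] if log_transform else list(values)
    n = len(vals)
    mean = sum(vals) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in vals) / (n - 1))
    keep = [mean - x * sd <= v <= mean + x * sd for v in vals]
    # return the original (untransformed) surviving observations
    return [v for v, k in zip(values, keep) if k]

# A heavily skewed volume series with one wild observation at the top.
volumes = [70_000] * 50 + [80_000] * 49 + [50_000_000]
kept = stdev_filter(volumes, x=4)
# the wild value is dropped; the 99 ordinary observations survive
```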
However, upon further investigation, we found that 100% of the probable data errors identified by the standard deviation filter at the x = 6 level were missing values. This means that if we dropped all missing values before applying the filter at this level, our results would look very different, as the filter would then be dropping 7.7% of the extreme events for no drop in data errors. Applying this filter on the whole data led to the removal of fewer outliers than applying it on a per-stock basis. Using the x = 6 level on the whole data appeared to yield the best results: 58.4% of probable data errors were filtered out while no probable extreme events were dropped. For more detailed results, refer to Table 3. However, even in this case, 100% of the probable data errors identified were missing values. This means that if we were to drop all missing values before applying this filter, it would identify 0% of the probable extreme events and probable data errors. Thus, the question arises whether we are actually over-filtering at this level and, if so, whether x should be increased even further.

5. Conclusion
Data cleaning is an extremely subjective process, which makes it quite impossible to objectively specify the level of optimal filtering; this is perhaps the reason behind the limited research in this area. This limitation of research in the field, and consequently of this paper, should be noted. That being said, even though some of the filters chosen were more arbitrary than others, we have made an attempt to objectively analyze the impact of each filter applied. The issue of missing values for any of the variables should be taken into consideration, because they are data errors and, if we were to neglect them, they would distort our analysis, since they interact with the various filters applied.
Alternatively, we could have dropped them before starting our analysis, but we don't know whether researchers would prefer to change them, for instance to the neighboring value, or to filter them out; therefore, it is interesting to see how the filters interact with them. Our analysis indicates that when it comes to the optimal amount of data cleaning, rule of thumb filters perform better than statistical filters like trimming, winsorization and the standard deviation method. This is because statistical filters assume that any extreme value outside a specified window must be a data error and should be filtered out, but as our analysis indicates, extreme events don't necessarily lie within this specified window. Rule of thumb filters, on the other hand, set logical thresholds rather than just removing or limiting observations from each tail of the distribution. The outcomes of the different filters, which are shown in Tables 1, 2 and 3, are represented graphically below.

Figure 2. Plot of the outcomes of all the data cleaning methods

As shown in Section 4.2 and the graph above, winsorization, whether on a stock basis or on the whole data, tends to filter out a large portion of probable extreme events. Thus, it is not a suitable filter if we want to retain the maximum number of probable extreme events, and it should be avoided if possible. As for the standard deviation filter, as shown in Section 4.3, applying it at the x = 6 level, whether on a per-stock or whole-data basis, seems to perform well, but it is not of much value when combined with a missing-values filter, and all other scenarios tested actually dropped more probable extreme events than data errors. Therefore, it is not prudent to simply drop outliers at the tails of distributions without understanding the cause behind their existence. This leaves us with the rule of thumb filters.
We combined the filters that performed optimally, the spread filter and the additional return filter which accounts for reversals, along with a filter for removing the missing values. This resulted in dropping 102, i.e. 63.4%, of all probable data errors without removing any probable extreme events. At this point, a trade-off has been made: in order not to drop any probable extreme events, we have forgone dropping some additional probable data errors, because over-filtering is a serious form of risk. This highlights the struggle of optimal data cleaning: because researchers often don't have the time to check the reason behind the existence of an outlier, they end up removing probable extreme events in the quest to drop probable data errors. Thus, the researcher has to first determine what optimal filtering actually means to him: not dropping any probable extreme events, albeit at the expense of keeping some data errors, as done in this paper, or giving priority to dropping the maximum number of data errors, albeit at the expense of dropping probable extreme events? In the latter case, statistical filters like trimming, winsorization and the standard deviation method can also be used, carefully.

The limitations of this paper should also be recognized. Firstly, only 200 outliers were analyzed due to time constraints; future research in the area could look at a bigger sample to get more insightful results. Secondly, other variables can be examined in addition to depth, volume, spread and returns, and further popular filters can be applied and tested on them. Moreover, a different definition can be used to define an outlier or to select the sample; for example, the 200 outliers could have been selected randomly or based on their level of extremeness, but close attention must be paid to avoiding sample biases.
Future research in this field should perhaps also focus on developing more objective filters and methods of classifying outliers as probable extreme events. It should also look into the impact of using the above two approaches to optimal filtering on the results of empirical research, e.g. on robust regressions, to ascertain which approach of optimal filtering performs best.

Table 1: Outcome of Rule of Thumb Filters Applied
Table 2: Outcome of Winsorization Filters Applied
Table 3: Outcome of Standard Deviation Filters Applied
Figure 3: Distributions before and after log transformation (3.1 Depth; 3.2 Effective Spread; 3.3 Quoted Spread; 3.4 Volume)
Figure 4: Distribution of transaction price before and after log transformation

References
Bollerslev, T./Hood, B./Huss, J./Pedersen, L. (2016) Risk Everywhere: Modeling and Managing Volatility, Duke University, Working Paper, p. 59
Brownlees, C. T./Gallo, M. G. (2006) Financial Econometric Analysis at Ultra-High Frequency: Data Handling Concerns, SSRN Electronic Journal, p. 6
Chordia, T./Roll, R./Subrahmanyam, A. (2000) Market Liquidity and Trading Activity, SSRN Electronic Journal, p. 5
Dacorogna, M./Müller, U./Nagler, R./Olsen, R./Pictet, O. (1993) A geographical model for the daily and weekly seasonal volatility in the foreign exchange market, Journal of International Money and Finance, p. 83-84
Dacorogna, M. (2008) An Introduction to High-Frequency Finance, Academic Press, San Diego, p. 85
Eckbo, B. E. (2008) Handbook of Empirical Corporate Finance SET, Google Books, p. 172, https://books.google.co.uk/books?isbn=0080559565
Falkenberry, T. N. (2002) High Frequency Data Filtering, https://s3-us-west-2.amazonaws.com/tick-data-s3/pdf/Tick_Data_Filtering_White_Paper.pdf
Goodhart, C./Figliuoli, L. (1991) Every minute counts in financial markets, Journal of International Money and Finance 10.1
Green, C. G./Martin, D. (2015) Detecting the Presence of Multivariate Outliers in Fundamental Factor Data Using Calibrated Robust Mahalanobis Distances, University of Washington, Working Paper, p. 2, 8
Hussain, S. M. (2011) The Intraday Behaviour of Bid-Ask Spreads, Trading Volume and Return Volatility: Evidence from DAX30, International Journal of Economics and Finance, p. 2
Laurent, A. G. (1963) The Lognormal Distribution and the Translation Method: Description and Estimation Problems, Journal of the American Statistical Association, p. 1
Leys, C./Klein, O./Bernard, P./Licata, L. (2013) Detecting outliers: Do not use standard deviation around the mean, use absolute deviation around the median, Journal of Experimental Social Psychology, p. 764
Scharnowski, S. (2016) Extreme Event or Data Error?, Presentation of Seminar Topics (Market Microstructure), Mannheim, Presentation
Seo, S. (2006) A Review and Comparison of Methods for Detecting Outliers in Univariate Data Sets, University of Pittsburgh, Thesis, p. 6
Verousis, T./Gwilym, O. (2010) An improved algorithm for cleaning Ultra High-Frequency data, Journal of Derivatives & Hedge Funds