13 June 2015

Is human head transplant research unethical?

What are the ethical arguments against research in head transplants being carried out by Dr. Xiaoping Ren in China?
 
There are ethical considerations based on the objections of animal rights activists. I don't discount them, but I'm not well-versed in their terminology.

The ethical considerations as extended to humans are most obvious to me. Even if such a thing as a head transplant were possible, which is extremely unlikely given the complexity of reattaching a severed spinal cord, what are the criteria for body donors?
 
Here is the ethically challenging scenario: You have two people, 
  • one with a paralyzed body but a healthy head 
  • the other with a healthy body and a healthy head. 

03 June 2015

How can a knowledge economy work in practice?

How can a knowledge economy work in practice, with very few non-technical, lower-skilled niches for the workers?

A knowledge economy with very few non-technical, lower-skilled opportunities cannot work in practice. A knowledge economy is a service economy. The US is a knowledge economy now, and it is not working so well.

Germany is still a manufacturing economy. It is not broken. Yes, there is the entire EU import-export drama which contributes to Germany's relatively greater prosperity; however, Germany did not outsource all its manufacturing and many of its highly-skilled worker positions (including software development and pharmaceuticals R&D) to developing nations, unlike the USA.

More education is not a panacea for economic growth. Here's a recent discussion of that, https://www.project-syndicate.org/commentary/education-economic-growth-by-ricardo-hausmann-2015-05, although it makes incorrect observations about productivity being mostly driven by primary school and high school education. Worker productivity is not the issue at all; in fact, reputable studies indicate that worker productivity has never been higher.

Capitalism can work, but needlessly complicated tax laws allow the wealthiest individuals and corporations to stash trillions of dollars in offshore accounts. (Profits soar to previously unrecorded heights; a few CEOs reap unprecedented benefits as the middle class in much of the developed world erodes). It is becoming increasingly difficult to operate a small business other than basic services in the USA, due to inconsistently applied and burdensome rules and regulations.

Is 1500 microseconds of latency low enough to do high frequency trading?

My answer to "On a modern high-end machine, how many clock cycles is 1500 microseconds?"

The article says that the BATS exchange gives preferred customers access to stock pricing data with 1 microsecond latency, whereas everyone else, i.e. those who don't pay a LOT extra, has delays of up to 1500 microseconds in receiving the same pricing data. There is an SEC requirement, Regulation NMS, which states that everyone is supposed to have equal access to market information. (Under Reg NMS, which is incorporated into each exchange’s operating policies and subscriber agreements, exchanges are required to distribute information on “terms that are fair and reasonable” and that are “not unreasonably discriminatory.”)

Apparently, there is a lot that one can do if one has a 1500 microsecond (1.5 millisecond) advantage over the rest of the market, when it comes to high frequency trading (HFT), and perhaps with algorithmic trading in general.


Judge Forrest further explained that, in promulgating Regulation NMS, the SEC had interpreted federal securities law to require the exchanges to transmit data to processors at the same time that the exchanges submit this data to the preferred proprietary feeds. Judge Forrest highlighted an SEC statement during rulemaking that provided that exchanges are not required to “synchronize the delivery of its data to end-users.”
There are lots of good definitions of latency in "Achieve Ultra-Low Latency for High-Frequency Trading Applications" https://www.cisco.com/c/dam/en/us/products/collateral/switches/nexus-3000-series-switches/white_paper_c11-716030.pdf [PDF], which you might find helpful. It describes the orders of magnitude required for HFT on the server side and in signal propagation/switching, both of which are measured in units of nanoseconds, not microseconds, and certainly not milliseconds.
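To make those scales concrete, here is a minimal C++ sketch of my own; it is not taken from the Cisco paper, and the price table, key range, and iteration count are invented purely for illustration. It times a single hash-table lookup at nanosecond resolution, which on typical hardware comes out to tens of nanoseconds, so a 1500 microsecond (1,500,000 nanosecond) head start is an eternity by comparison.

// Toy example (my own, not from the Cisco paper): timing one hash-table
// lookup at nanosecond resolution, to show the scales involved in HFT.
#include <chrono>
#include <cstdio>
#include <unordered_map>

int main() {
    // A stand-in "price table"; the keys and values here are made up.
    std::unordered_map<int, double> last_price;
    for (int i = 0; i < 10000; ++i) last_price[i] = 100.0 + i * 0.01;

    const int iters = 1000000;
    volatile double sink = 0.0;  // keeps the compiler from deleting the loop

    auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < iters; ++i) {
        sink = last_price[i % 10000];  // one hash-table lookup per iteration
    }
    auto t1 = std::chrono::steady_clock::now();

    auto ns = std::chrono::duration_cast<std::chrono::nanoseconds>(t1 - t0).count();
    std::printf("avg lookup: ~%.1f ns (for comparison, 1500 microseconds = 1,500,000 ns)\n",
                static_cast<double>(ns) / iters);
    return 0;
}

Compile it with something like g++ -O2; the exact number will vary by machine, but the point about orders of magnitude stands.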

Diving down into the comments


Anon   That's what I was attempting to point out: 1.5 ms is not that long, even on a high-end machine. A single hash table look-up using one of the fastest hashing algorithms is a minimum of 12 clock cycles (proprietary algorithm), and to even be able to know what happened in a trade would require at least 4 such lookups, which is 48 clock cycles just to know what happened. The data communications likely eat up another 20 or 30 clock cycles. Add in the time to record any decision (about another 10 to 20 clock cycles) and we are left with a remainder of anywhere from 2 to 25 clock cycles. The OS would likely eat 5 or so of those, so we are left with at most 10 integer operations (2 to 3 floating point), assuming no branching and a fetch from RAM rather than from a register. What on earth can be done in that amount of time? .... so the latency there is not an issue and the judge is right to dismiss the case.... btw, I personally think HFT should be banned.

Me   The issue in Lanier v. BATS is not the machine processing time so much as the delivery time, I think.

Adam Helps   Um, modern machines work in fractions of nanoseconds, not microseconds, and you can assume that someone who cares about HFT is going to be obsessive about using those clock cycles optimally.

A 4 GHz machine (which is high end consumer grade hardware these days) executes 4 billion clock cycles per second. In one microsecond, that's 4,000 instructions. In 1500 microseconds, that's 6,000,000 instructions.

True, at this scale, memory fetches take 50 full clock ticks, but someone doing HFT is going to make sure they hit the cache almost every time. And I think you can do quite a lot of thinking in 6,000,000 instructions.

Oh, and I completely agree that HFT is a worthless parasite on our markets, is providing virtually no value, and we'd be better off banning it. Some truly brilliant people have devoted a lot of time and effort to making it work, and I'd like the market to encourage them to find something more beneficial to do.

Me   This is true: nanoseconds are the latency units of concern in HFT, and this has been true for some time now, I think. I wrote this back in 2012, CFTC Regulatory Hubris, but even then 2 milliseconds was a huge savings. Have a look at this too: HFT - How to define and measure latency? "When people say they have achieved millisecond or nanosecond latency, which two points is that between?"

Adam   Fascinating articles! I work in industrial design, not finance, but I do use C++. It seems like a lot of brilliant C++ programmers are in finance, and I get recruiter emails from financial companies occasionally. The salaries are very high, so I can see why people are attracted to it, but it does seem a little sad somehow.

Me   You have a wonderful and wise perspective. I'm so happy you liked those two articles! I used to work at IBM, modeling storage product performance, before I worked in finance. I'm a mathematician, not a programmer, though; I used APL and SAS for all my programming, and I only wish I knew C++. I don't even know what object-oriented programming is. I write awful code :o) I am fascinated by HFT but also appalled by the way in which it undermines the actual intent of financial markets. It is a tech hack that should be illegal. HFT reminds me of Uber, which I believe is a regulatory hack rather than a technological innovation. The allure and money of HFT, and the interest it generates, are very hard to resist... no one ever gets excited about network latency outside of HFT and algorithmic trading!

Adam   Object oriented programming is just a label slapped onto a bundle of programming concepts that were all fashionable at the same time, so it's difficult to define succinctly anyway. The problem it tries to address is that one tends to write the exact same algorithms over and over again, with only minor variations, and yet it's very difficult to reuse your old code. It's wasteful. Major highlights were separating interface from implementation, reducing visibility of internal details, and "inheriting" old code into a new class to allow changes without starting over.

The shiny new thing in C++ these days is templates (aka generics). They're screaming fast (HFT guys love 'em) and they fulfill the promise of code reuse elegantly. Sadly, their syntax is a Byzantine nightmare of random punctuation that makes parsers weep for mercy. One can't have everything, I suppose.
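(An aside for readers who, like me, have never written C++: here is a tiny, purely illustrative template, my own made-up example rather than anything Adam wrote, showing the code reuse he describes: one generic function that works for any element type.)

// Purely illustrative template example (not Adam's code): one generic
// function, reused for any element type that supports operator<.
#include <iostream>
#include <string>
#include <vector>

template <typename T>
const T& largest(const std::vector<T>& items) {
    // Assumes items is non-empty; real code would check for that.
    const T* best = &items[0];
    for (const T& x : items)
        if (*best < x) best = &x;
    return *best;
}

int main() {
    std::vector<int> ticks = {3, 1, 4, 1, 5};
    std::vector<std::string> names = {"IBM", "BATS", "AAPL"};
    std::cout << largest(ticks) << "\n";  // prints 5
    std::cout << largest(names) << "\n";  // prints IBM (lexicographically last)
    return 0;
}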

Me   Thank you for explaining that, quite nicely! I used to write subroutines (in Pascal, maybe...?). I thought of them as being analogous to modular furniture, which I would use over and over again, sometimes in new programs by copying and pasting, sometimes by calling them from older programs that I had written already. SAS has things called macros (different from the macros used in photography) that can be used to save the effort of rewriting blocks of code, although they aren't quite the same as my subroutines. I never took any real programming courses in school, so I am a perfect example of why self-instruction can be a terrible way to learn!

Anon   You're forgetting about context switches and stuff like that.... the smallest time slices on any OS I know of are 1 microsecond.

Adam   It's not that I'm ignoring context switching, it's that context switching doesn't matter. In an optimal real-time setup, the CPUs will run one thread each and never switch contexts at all. Context switching doesn't have anything to do with the rate at which instructions are executed, it just refers to how often CPUs can jump from executing one thread to executing a different one.

I did ignore multi-core systems, though. Eight cores give you 48,000,000 instructions in 1.5 ms, but they mostly aren't allowed to talk to each other (or must use best-effort lock-free methods), and staying in the cache becomes even more critical.
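For the record, here is the thread's back-of-the-envelope arithmetic written out as a small C++ sketch of my own. The 4 GHz clock, the roughly-one-instruction-per-cycle simplification, the 8 cores, and the ~50-cycle cache miss are the assumptions Adam used above, not measurements.

// Back-of-the-envelope cycle budget, using the thread's assumptions:
// 4 GHz clock, ~1 instruction per cycle, 8 cores, ~50 cycles per cache miss.
#include <cstdio>

int main() {
    const double clock_hz  = 4.0e9;    // 4 GHz, high-end consumer hardware
    const double window_s  = 1500e-6;  // the 1500-microsecond head start
    const int    cores     = 8;

    const double cycles_per_core = clock_hz * window_s;      // 6,000,000
    const double cycles_total    = cycles_per_core * cores;  // 48,000,000

    std::printf("per core: %.0f cycles; across %d cores: %.0f cycles\n",
                cycles_per_core, cores, cycles_total);
    // At ~50 cycles per miss, one core's budget covers ~120,000 cache misses,
    // which is why staying in cache, not the raw clock, is the real constraint.
    std::printf("per-core budget in 50-cycle cache misses: ~%.0f\n",
                cycles_per_core / 50.0);
    return 0;
}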


CFTC Regulatory Hubris 


This certainly doesn't look good, as far as ensuring access by all to capital markets. Well, not unless you happen to be running a high frequency trading (HFT) business. See "Algos Needed to Surveill Algos, O’Malia Says," https://www.securitiestechnologymonitor.com/news/omalia-algos-to-surveill-algos-cftc-30788-1.html
High-frequency trading accounts for the majority of trades in equity markets, which are overseen by the Securities and Exchange Commission. But automated trading now is approaching 50% of trading in futures contracts, as well.

What is the best way to implement regulatory oversight of a market driven by black boxes, dark pools and program trading? Make regulation an automated system too!
The regulation of derivatives markets, which extend from futures and options to credit default swaps, needs “totally automated systems” to oversee markets, going forward, O’Malia said.

The lowly Low Frequency Trader


One market, O’Malia said, is populated by low frequency traders that focus on “macro factors” like available supply, monetary policy and financial statements.

The low frequency trader, who accounts for a shrinking share of all market activity, concerns himself with arcane matters such as volume, economic policy and the financial statements of the companies he invests in.

No need for Certified Public Accountants, or industry analysts who are subject matter experts in manufacturing and technology companies... what do value investors know, anyway? In contrast,
the HFT market focuses on “market microstructural factors” such as “rigidities in price adjustments across markets” and “variations in matching engines". The only way we can develop the appropriate level of surveillance is through the deployment of algorithms and automation.

Variations in matching engines are the future; well, they are for those with access to 8.5 millisecond round-trip trade execution between New York and Chicago. See "Strip Down Symbols. Save Two Milliseconds Each Way" https://www.securitiestechnologymonitor.com/news/cfn-xcelor-two-millisecond-time-savings-30783-1.html on the CFN Xcelor two-millisecond time savings. Most of us can forget about the NYSE and NASDAQ, as that would be no place for the Low Frequency Trader, nor for any non-HFT retail investor.

Is HFT innovative?


The 2012 Annual CFA Society Conference was held last week. "Some Call It Innovation": much of what we call "financial services innovation" is merely dealer bookmaking.

Non-financial services

Now THIS is innovative! Delta Air Lines will operate its own oil refinery, and Delta says it won't sell fuel from its refinery on the open market: https://www.reuters.com/article/2012/06/18/refinery-operations-delta-monroe-idUSL1E8HI2Y320120618/ (This link still works!)

Delta Air Lines Inc, which took the bold step of bidding for a refinery to keep a handle on fuel costs, said it would not be selling jet fuel on the open market once the deal closes. "We will produce and sell the jet fuel to ourselves," said Eric Torbenson, a spokesman for the airline, the second largest in the United States. Monroe Energy, the Delta subsidiary formed to own the refinery, will sell the fuel back to Delta.

That's "out of the box" thinking! Delta Air will purchase its own refinery, and no longer need a fuel hedging trading desk, focusing instead on the core business of running an airline, and its verticals.

I worry about the farmers. The CME Group (whose Chicago Mercantile Exchange is overseen by the CFTC) serves the needs of many who are not speculators. Agricultural commodities futures, and options-on-futures, are essential for farm businesses of all sizes, as insurance against price volatility.

Big Data?

You knew it had to be squeezed in there somehow.
Expanding our surveillance to include order data will require additional hardware to store and sort a massive amount of data.
In what form, though? High-volume, rapid processing of transactional data is not well-suited to MapReduce and Hadoop.

[1] All quoted passages are from Securities Technology Monitor "CFTC's O'Malia Says Algo's to Surveill Algo's", 18 June 2012 unless otherwise indicated.

Describe a little-known and remarkable fact about the CIA

The CIA fostered and promoted American Abstract Expressionist painting around the world for about 30 years, and was remarkably successful in doing so.[1] 

The decision to include culture and art in its operations was made when the CIA was founded in 1947. In 1950, the International Organisations Division (IOD) of the CIA was set up. It subsidized the animated version of George Orwell's Animal Farm, sponsored American jazz artists, many opera recitals, and the Boston Symphony Orchestra's international touring program.

Rebutting the idea of America as a cultural desert

During the 1950s, Senator Joe McCarthy's hysterical denunciations of the avant-garde and unorthodox were deeply embarrassing. They discredited the idea that America was a sophisticated, culturally rich democracy.

Why Abstract Expressionism?

Jackson Pollock was one of the artists whose work the CIA helped to promote. (Image: Summertime Number 9A, via Art Crimes on Flickr.)
It was recognized that Abstract Expressionism was the kind of art that made Socialist Realism look even more stylized and more rigid and confined than it was... Moscow in those days was very vicious in its denunciation of any kind of non-conformity to its own very rigid patterns.[2]
One could reasonably conclude that anything the USSR criticized that heavily was worth supporting.