
The story was interesting, but the title is misleading. This wasn't the first hedge fund. Benjamin Graham started his first fund in the 1920s, and it would be what we call a hedge fund today. Graham's fund might not have been the very first hedge fund either, but it came before Jones's.


Straight from Wikipedia[1]:

> During the US bull market of the 1920s, there were numerous private investment vehicles available to wealthy investors. Of that period, the best known today is the Graham-Newman Partnership, founded by Benjamin Graham and his long-time business partner Jerry Newman. This was cited by Warren Buffett in a 2006 letter to the Museum of American Finance as an early hedge fund, and based on other comments from Buffett, Janet Tavakoli deems Graham's investment firm the first hedge fund.

> The sociologist Alfred W. Jones is credited with coining the phrase "hedged fund" and is credited with creating the first hedge fund structure in 1949. Jones referred to his fund as being "hedged", a term then commonly used on Wall Street to describe the management of investment risk due to changes in the financial markets.

In other words, it depends on who you ask and what your exact understanding is of what defines a hedge(d) fund.

[1] https://en.wikipedia.org/wiki/Hedge_fund#History


This reminds me of a story (I don't know its authenticity) where someone had invented the Black-Scholes model before Black and Scholes but didn't publish it; instead, they made a ton of money by putting it to work.


That would be Ed Thorp.

https://en.m.wikipedia.org/wiki/Edward_O._Thorp

He wrote this great book:

https://www.amazon.com/Man-All-Markets-Street-Dealer/dp/0812...

He’s almost 92 and still around. Here’s a 2022 Tim Ferriss interview:

https://youtu.be/gs39QFYIbBY?si=mEABrrb7yDwHR6sK


It’s also interesting that Thorp met a young Ken Griffin and gave him some advice and materials when he was starting out.

I don’t think a young Ken Griffin of today would get that same access.


There's also a great story that speaks to the dangers of using Black-Scholes as an investment thesis.

https://www.amazon.com/When-Genius-Failed-Long-Term-Manageme...


Doubt it. If you go to market with Black-Scholes, you will lose money.

https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&d...

"Why We Never Use the Black Scholes Equation" - https://youtu.be/UoGlUZPNouM


You would have made money, because the other tools in use at the time were much worse and people were pricing options poorly, so there were plenty of opportunities. That's why Ed Thorp did well.

It would not work today. For starters, it makes a bunch of simplifying assumptions, and it misses a number of important dynamics. There are also better models now.
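
For reference, the model itself is just a closed-form price under those simplifying assumptions (constant volatility, lognormal prices, frictionless continuous hedging). A minimal sketch of the call-price formula in Python:

    from math import log, sqrt, exp, erf

    def norm_cdf(x):
        # standard normal CDF via the error function
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    def black_scholes_call(S, K, T, r, sigma):
        # S: spot, K: strike, T: years to expiry,
        # r: risk-free rate, sigma: the (assumed constant) volatility
        d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
        d2 = d1 - sigma * sqrt(T)
        return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

    # a 1-year at-the-money call at a 5% rate and 20% vol prices around 10.45
    print(black_scholes_call(100, 100, 1.0, 0.05, 0.20))

Note that every input except sigma is directly observable; the assumption that volatility is a known constant is one of the things the better models relax.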


"Warren Buffett: Black-Scholes Formula Is Total Nonsense" - https://youtu.be/H4X-4e7fPgI


Jones’s fund is normally referred to as the first hedge fund because he used/invented shorting.

Before him, if you wanted to make a negative bet on a stock, you could really only do it with spread betting.


That’s not true. Shorting equities goes back to the Dutch East India Company and has been blamed for crises going back at least to the late 1700s.

Jacob Little was a giant Wall Street short seller in the 1830s.


The answer is that A.W. Jones was the first to use the long/short hedging strategy. That is why his fund is called a hedge fund and is considered the first of its kind. Others had used shorting before, but he was the first to specifically enter a short position to protect a different long position. Well, he was the first to officially theorize it and market it, at least. That is why, when Carol Loomis described his fund, she coined the term "hedge fund".


That is exactly what I was going to post.


The attention mechanism started as a simple trick to avoid using recurrent neural networks.

Read the intro in the original paper "Attention is all you need" (https://arxiv.org/abs/1706.03762)

This video explains the drawbacks to RNNs and how transformers solve that: https://youtu.be/S27pHKBEp30?t=394

Andrej Karpathy explains attention here: https://youtu.be/kCc8FmEb1nY?t=3719

He explains how attention is seen as a communication network: https://youtu.be/kCc8FmEb1nY?t=4298


> Read the intro in the original paper "Attention is all you need"

I wouldn't call this the original "attention" paper. It's definitely not the first paper to use the term. If you want clear proof of this, let's read the paper itself:

> Attention mechanisms have become an integral part of compelling sequence modeling and transduction models in various tasks, allowing modeling of dependencies without regard to their distance in the input or output sequences.

I do think a lot of people's lack of understanding of attention is because they are so focused on (scaled) dot-product attention that they miss a lot of the broader picture. And the math. Not enough people dig into the math.
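
To make that concrete, the core of (scaled) dot-product attention is only a few lines of numpy. A minimal sketch, ignoring batching, masking, and multiple heads:

    import numpy as np

    def softmax(x, axis=-1):
        # subtract the row max for numerical stability
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    def scaled_dot_product_attention(Q, K, V):
        # Q, K: (seq_len, d_k); V: (seq_len, d_v)
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)     # similarity of every query to every key
        weights = softmax(scores, axis=-1)  # each position's distribution over all positions
        return weights @ V                  # weighted average of the values

    rng = np.random.default_rng(0)
    Q, K, V = rng.normal(size=(3, 5, 8))
    print(scaled_dot_product_attention(Q, K, V).shape)  # (5, 8)

This is also the "communication" view from the Karpathy video: the softmax weights determine how much each position reads from every other position.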


This is an interesting feature, but you don't have to use OpenAI's embeddings. You can generate your own with open-source SOTA transformer models, which would probably work just as well; you could generate a couple hundred thousand embeddings on a rented A100 for less than $2. The point of converting text or other objects (like images) into embeddings is to compare a large number of documents to a source document very fast, so it's more useful to put the embeddings in something like Redis. This pgvector data type would be good for an offline backup of the vectors.
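
To illustrate that comparison step: once all documents are embedded into one matrix, ranking against a source document is a single normalized matrix-vector product. A minimal sketch (names and dimensions are placeholders):

    import numpy as np

    def top_k_similar(doc_embs, query_emb, k=5):
        # doc_embs: (n_docs, dim); query_emb: (dim,)
        # cosine similarity = dot product of L2-normalized vectors
        docs = doc_embs / np.linalg.norm(doc_embs, axis=1, keepdims=True)
        q = query_emb / np.linalg.norm(query_emb)
        sims = docs @ q
        top = np.argsort(-sims)[:k]
        return top, sims[top]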


> And the point of converting text or other objects (like images) into embeddings is to compare a large number of documents to a source document very fast.

But a "source document" could also be a natural-language query that you convert to an embedding, if you want to enable natural-language search on your corpus (maybe along with queries in French getting good semantic hits in English, etc.).


Do you have any examples that come to mind for this? I'd love to understand what other models exist and what Redis extensions exist to compare embeddings.


This will load the transformer models from Hugging Face (their models have a similar architecture to OpenAI embedding models): https://www.sbert.net/docs/pretrained_models.html

Redis has approximate nearest-neighbors vector similarity search: https://redis-py.readthedocs.io/en/stable/examples/search_ve...

Generate the embeddings on a rented GPU, push them to Redis, then do a similarity search. Store the vectors in Redis using ndarray.tobytes().
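
A rough end-to-end sketch with redis-py's search module (this assumes a Redis Stack server and the all-MiniLM-L6-v2 model, which outputs 384-dim vectors; the index and key names are made up):

    import numpy as np
    import redis
    from redis.commands.search.field import TextField, VectorField
    from redis.commands.search.query import Query
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("all-MiniLM-L6-v2")  # 384-dim embeddings
    r = redis.Redis()

    # FLAT is exact brute-force search; use HNSW for approximate nearest neighbors
    r.ft("docs").create_index([
        TextField("text"),
        VectorField("vec", "FLAT", {"TYPE": "FLOAT32", "DIM": 384,
                                    "DISTANCE_METRIC": "COSINE"}),
    ])

    corpus = ["the cat sat on the mat", "stocks fell sharply today"]
    for i, (text, emb) in enumerate(zip(corpus, model.encode(corpus))):
        r.hset(f"doc:{i}", mapping={"text": text,
                                    "vec": emb.astype(np.float32).tobytes()})

    # embed the query the same way, then run a KNN query against the index
    q_vec = model.encode("feline on a rug").astype(np.float32).tobytes()
    q = (Query("*=>[KNN 2 @vec $v AS score]")
         .return_fields("text", "score")
         .dialect(2))
    for doc in r.ft("docs").search(q, query_params={"v": q_vec}).docs:
        print(doc.text, doc.score)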


Does Redis let you do some kind of similarity-based lookup? Is that new?


It's not. Managers lie about the performance of their employees in order to save either themselves or their favorites in the group. I'm a current Amazon employee.

