wesleyy's comments | Hacker News

"Rent seeking" refers to economic rent, which is not the same thing as "housing rent"; the term has a very specific definition in economics. Renting out houses for people to live in is not rent seeking.


According to certain economic philosophies, all rent on land (but not buildings) is economic rent.


Exactly. You can even rent-seek by buying a plot of land and intentionally keeping the number of units below what demand would support. E.g. you have a plot with enough demand to house 20 people, but you only build a single-family house for 4 people (2 parents, 2 children), denying 16 people the possibility of living there. The land is still priced as if it could support 20 people, and you make your money off the sale of the land. This is what people buy into when they buy their house as an "investment": they bank on population growth and increasing demand. If upzoning is impossible, then you gain nothing from selling, because you have to pay taxes and then buy another property that costs the same amount (which you can't afford because of the taxes). That turns the real estate wealth into paper wealth that has to be paid for (your mortgage) but is difficult to draw from except via reverse mortgages.


Economic rent takes its name from and is a generalization of land rent; land rent isn't just an example of economic rent, it is the paradigmatic example.


No, that's what the light field does. You see different physical images depending on your angle to the screen.


Fascinating. So not only is it feeding it an 8K / 30 (60?) FPS image, it's feeding it numerous incident angle variations and displaying all of them simultaneously?

Sounds like a monster data rate.
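
For a rough sense of scale (my own back-of-envelope numbers, assuming an uncompressed 8K, 60 fps, 24-bit RGB feed):

    # Raw data rate for an 8K, 60 fps, 24-bit RGB panel feed,
    # before any light-field trickery or compression.
    width, height = 7680, 4320
    bytes_per_pixel = 3              # 8-bit RGB
    fps = 60

    rate = width * height * bytes_per_pixel * fps
    print(rate / 1e9, "GB/s")        # ~5.97 GB/s, roughly 48 Gbit/s

So even the plain panel feed is monstrous before you multiply anything by viewing angles.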


I think that is where the custom compression algorithm comes in. If you consider that the human body and face don't change much, and that it's based on a 3D model, the compression ratio could be very high.


Good point. Also the very neutral background would contribute to that.


I bet they'd also just fix focus on the person and whatever they're holding, then blur out the background in most cases.


I only know what I saw from the I/O stream, but I think it might send a compressed 3D mesh + texture across the network and render the light field locally.


I think what they are transferring is not video but a 3D model and the skin texture applied to the model (all derived from the realtime video / depth recording on the other side). The receiving end then renders it as a 3D model on the screen.
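
As a rough illustration of why that would be so much cheaper than video (every number here is a guess on my part):

    # Hypothetical per-frame payload for a mesh update vs. a raw video frame.
    vertices = 50_000                        # guessed mesh density
    mesh_bytes = vertices * (3 * 4 + 2 * 4)  # xyz + uv as 32-bit floats
    frame_bytes = 1920 * 1080 * 3            # one uncompressed 1080p RGB frame

    print(mesh_bytes / 1e6, "MB")   # ~1.0 MB per mesh update, pre-compression
    print(frame_bytes / 1e6, "MB")  # ~6.2 MB per raw frame

And the texture and mesh topology presumably only need to be sent occasionally, while vertex positions change per frame.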


https://news.ycombinator.com/item?id=23316225

> the 8K is their Input Resolution.

> That resolution is then divided into the 45 viewing directions:
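
Doing the arithmetic on that quote (assuming the 45 views split the panel's pixels evenly):

    # 8K input carved into 45 viewing directions
    total_pixels = 7680 * 4320       # ~33.2 MP
    views = 45
    print(total_pixels / views)      # ~737,280 pixels per view, under 720p

So it isn't 45 full 8K images at once; each direction gets a sub-720p slice of the panel.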


Is existing Looking Glass Factory tech the same though? Not so sure about that. Those displays are typically monitor-sized at the largest and not really aimed at displaying a live feed of a person. This looks to be a more seamless experience on a larger screen.


Downvoted, with no response, for posing an open question. Shame on you, honestly.


Sounds like eye tracking could still be useful to not bother with images for angles that are 100% not visible at the moment.
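
Something like this, completely made-up geometry, just to sketch the culling idea:

    # Only render the subset of the 45 views near the tracked viewer angle.
    views = 45
    fov_deg = 50.0                   # hypothetical total viewing cone
    step = fov_deg / views           # angular spacing between views

    def visible_views(viewer_deg, margin_deg=5.0):
        return [i for i in range(views)
                if abs(-fov_deg / 2 + i * step - viewer_deg) <= margin_deg]

    print(len(visible_views(0.0)))   # ~10 of 45 views actually need rendering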


How does any kind of fine for PG&E not end up being a regressive tax? As long as they are the only supplier in NorCal the costs will just be passed on to all NorCal residents.


Because the rates they charge customers are highly regulated. PG&E is simply not allowed to raise rates to pay fines.

Also, they are not the only supplier in NorCal, SMUD and many community choice energy organizations exist.


Where will the money come from to pay the fines?


They could take it out of the profit portion of the rates, or cut executive pay. But knowing PG&E they will cut back on maintenance, maybe sell a power plant; most likely creditors will pay and then lose out the next time they file for bankruptcy.


> But knowing PG&E they will cut back on maintenance

That would be a decision made by the government of California, not by PG&E. Their budget has to be approved by the state.

There were some fun articles back when they were blamed for fires about how PG&E had spent the last several years trying to allocate money to equipment maintenance only to have their budget rejected whenever they tried.


> That would be a decision made by the government of California, not by PG&E. Their budget has to be approved by the state.

Oh, like PG&E would never commit fraud to conceal not doing what it is legally obligated to do with regard to safety.

I mean, it's not like it's been convicted of 91 federal and state felonies in two separate incidents producing substantial property damage and death, which involved exactly that behavior.

The PUC controls (some of) what PG&E is legally obligated to do, but it doesn't control what PG&E actually does.


> but it doesn't control what PG&E actually does.

Except for the part where they control what PG&E is and isn't allowed to spend on?

AKA we're not controlling what you do, just your ability to do it.


> Except for the part where they control what PG&E is and isn't allowed to spend on?

Once again, PG&E’s 91 recent state and federal felony convictions demonstrate that controlling what the company is legally allowed to do with resources is not the same thing as controlling what the company actually does with resources.


You say this like nobody's ever been punished for doing exactly what they were told to do.


Huh, a private corporation whose budget is approved by the state, and where the state meddles to prevent it from investing in critical infrastructure.

That sounds like the worst of both worlds.


At this point the state might as well just take over the company as it sounds like they have essentially done so through regulation anyway.


But then there won’t be a fall guy when things go wrong.


Well, yes, that would make more sense.

I assume the idea behind the current setup is that having the state own a power company would be "communist", whereas having a "private" power company for which the state determines the areas it serves, the rates it charges customers, and how it spends its money is "not communist".


Never attribute to anti-communism that which can be more easily explained by crony capitalism.


Profits will be reduced?


Hopefully shareholders, ultimately.


Do that enough times and nobody will invest in California's anything.


The property I just bought in northern CA was a trip. In the full title report I got documents dating back to 1880 - all hand written. I got to read about various families that owned the land, and transferred ownership to various companies, including "Northern California Power Company Consolidated" in 1917 and eventually "Pacific Gas and Electric Company" as of December 10th 1948.

It is a total mind-fuck how tied up we are with that company.

But, I do have some hope that at some point in the future mini-grid technology will replace these large juggernauts.


Fines are probably harder to deliberately mislabel as capex, and therefore less likely to get approved by the CPUC to be passed on to ratepayers. That said, I think these proposed fines are silly. PG&E did actually try really hard to get the word out, burying folks under a mountain of mailers and text messages and social media posts and advertising.

Pointing to medical baseline customers as some ultra-vulnerable population and claiming that PG&E has extra responsibility to keep their power on feels disingenuous. Those are precisely the users who should be most keenly aware of their uptime requirements and have their own backups provisioned.

It felt weird writing that. It makes me sound like a PG&E apologist. Changing focus...

PG&E has optimized its business to reduce transparency and accountability, while also engaging in creative accounting to drive up costs and improve their take under a cost-plus model. Read the Camp Fire report (it's well-written and engaging) to get a better picture of the firm and its habits: https://www.buttecounty.net/Portals/30/CFReport/PGE-THE-CAMP...

(As an aside, reading that report, it's clear that Californians surely share some of the blame here. Overzealous conservationist regulation has made it unnecessarily difficult to do infrastructure maintenance. You can't do work in one season because the weather makes it dangerous or you're not allowed downtime. You can't do work in the other season because there's nesting birds on your poles. Or you can't cut down that protected heritage tree. And things like CEQA make building new unreasonably difficult, so you have to live with what you've got.)

Prior to 2014, the CPUC was run by a dude soliciting bribes from those he regulated. So it's not surprising that the organization isn't doing a great job of putting consumer interests first. Corporate culture flows from the top: https://www.sfgate.com/bayarea/article/CPUC-head-Michael-Pee...

PG&E has taken all of us for a ride, and has managed to do it despite a huge degree of regulatory oversight. We pay some of the highest rates for electricity in the United States, through some of the most complicated rate plans, and still don't manage to keep the lights on or avoid setting fires. And the rate of price increases makes healthcare and higher education almost look sensible. See slide 28: https://www.cpuc.ca.gov/uploadedFiles/CPUCWebsite/Content/Ne...


GitHub is already a bazel package management system though? If the package is a bazel workspace, all you need to do is add an http_archive rule pointing to that GitHub repo.
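
Something like this in your WORKSPACE (repo name, version, and hash are placeholders):

    load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

    http_archive(
        name = "some_lib",  # hypothetical package
        urls = ["https://github.com/example/some_lib/archive/v1.2.3.tar.gz"],
        strip_prefix = "some_lib-1.2.3",
        sha256 = "<sha256 of the tarball>",
    )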


That would work if, like golang, bazel was the "default" package manager for everyone. Right now it's not easy to get, for example, vulkan or muslc or qt as a bazel package.

It's also not easy to publish a version of your package (A) that depends on another package (B). This creates a diamond-problem-like situation when a third package (C) depends on both (A->B, C->A, C->B). So some code needs to resolve these conflicts and reproducibly identify the exact hashes of everything to pull in, to make it a not-manual process.

Also, something great about the design docs linked in my other post: there's a presubmit.yaml standard, so pulling in a library will include tests that bazel will run for whatever arch you're compiling for. For instance, say you pull in sqlite and need to build it for RISC-V. Before, you just needed to hope that sqlite worked correctly on your arch; now you'll be able to test those situations in CI with RBE runners for all architectures.


> That would work if, like golang, bazel was the "default" package manager for everyone. Right now it's not easy to get, for example, vulkan or muslc or qt as a bazel package.

I agree, but I don't think a "Bazel management system" would solve this issue, because the problem is getting people to buy into bazel in the first place.

> It's also not easy to publish a version of your package (A) that depends on another package (B). This creates a diamond-problem-like situation when a third package (C) depends on both (A->B, C->A, C->B). So some code needs to resolve these conflicts and reproducibly identify the exact hashes of everything to pull in, to make it a not-manual process.

This is a good point. However, I think realistically, effort would currently be better spent on making it easy to bazelize existing code. I have (unfortunately) never been able to pull in an external library without manually bazelizing it, and this only actually becomes a problem once bazel picks up enough momentum in OSS that you are likely to find an external library that is already bazelized.


How do the existing repos that do have this dependency structure solve the problem? For example there are loads of packages that depend individually on Abseil. If my package uses Abseil and it uses tcmalloc, it also uses Abseil by way of tcmalloc, but in practice this does not seem to cause trouble.


Each dependency appears as a "repository", so as a bazel target it will look like "@<thing>//some/target:file". Everything refers to a dep by its workspace/repository name and exports a `repository.bzl` or `workspace.bzl` file that your WORKSPACE file `load()`s and calls a function from.
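
Roughly like this, with all names made up:

    # deps.bzl exported by the hypothetical @some_lib repository
    load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

    def some_lib_deps():
        # Guard so a dep already declared by the root WORKSPACE
        # (or by another library's macro) isn't declared twice.
        if "zlib" not in native.existing_rules():
            http_archive(
                name = "zlib",
                urls = ["https://example.com/zlib.tar.gz"],  # placeholder
                sha256 = "<hash>",
            )

and then in your WORKSPACE:

    load("@some_lib//:deps.bzl", "some_lib_deps")
    some_lib_deps()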

This is pretty standardized and "recommended" by bazel-federation: https://github.com/bazelbuild/bazel-federation

A good example of how this becomes a massive mess: https://github.com/bazelbuild/rules_docker/blob/master/repos...


It does seem a bit high touch, but don't I also have the alternative of just cloning third party code into my repo and bazelizing it myself? I've certainly seen that done, and it's what Google does internally as well.


It's possible, and it's what I've done quite a bit when using bazel, but it makes code sharing very difficult. I think the internal desire from Google likely comes from TensorFlow and Cloud wanting to ship code easily to the OSS world. One of the reasons PyTorch is taking off is because people can build it easily!


Not every package (especially core system packages like zlib/openssl/glibc/...) is on GitHub and wants to pull Bazel buildfiles into its source tree. As such, there's no guaranteed canonical-upstream-repo:buildfile-repo mapping, so you need some way to organize, keep track of what's where, and make sure things work well together.


The point is that there is no idle fee. The legislation just states the minimum required wage for time and miles spent with a passenger, as well as a $5/trip minimum.


> While stocks may have some intrinsic value, it is commonly acknowledged that their value as assets is substantially lower than their stock market price. Moreover companies continue to buy their own stocks further reducing the value of stocks as assets.

This is not really relevant. Companies have larger market caps than their intrinsic assets because people trade on discounted cash flow.
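
i.e. roughly this (numbers made up, just to show the mechanism):

    # Minimal discounted-cash-flow sketch: value today is the sum of
    # expected future cash flows discounted back at some rate r.
    cash_flows = [100, 110, 121, 133, 146]   # projected free cash flow
    r = 0.08                                  # hypothetical discount rate

    pv = sum(cf / (1 + r) ** (t + 1) for t, cf in enumerate(cash_flows))
    print(round(pv, 1))   # ~480.1, the priced-in value of the future stream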

> Any small players (yes that means you and me) can easily lose their shirt in a down turn as we do not have reserve assets or other means (hedge funds) and are not able to participate in the next hand.

1. There's no mechanism that forces you to "call" a bet in investing. It's not winner takes all.

2. Why can't you allocate your bond/equity split such that you do have reserve assets?


I don't think Google's implementation of their hermetic par files are open source.


Interesting how the University of Waterloo was never mentioned, given its heavy ties with YC and that it was his main university.


I would love to apply for an internship position, but my god is that career site awful. Midway through applying it just does nothing but return "Error You have encountered a system error. We apologize for the inconvenience."


We make cars, but perhaps we could improve our career website. Thank you for the feedback.

We're looking for people who see problems (like the one you experienced), along with efficient and elegant solutions to fix them. Perhaps you'd make a great intern. If there's anything specific we can do to help you with your internship inquiry, please reply or provide feedback to the site manager.


If you're interested in hiring somebody who can fix your career site for you, let me know. My email is in my profile.


Most likely the latter

