Hey thanks (I'm the author)! BTW the "Pro" version has the electrochromic dimming, so I recommend paying a little extra for that unless you're really sure you're not going to need it.
EDIT: To clarify, I meant the "Xreal Air 2 Pro", not the "Xreal One Pro". The latter are much more expensive.
They are $299 on sale on the vendor website right now. I won't link because I don't necessarily want to promote them, but I think you must have seen a different vendor or something?
I've been thinking about using Xreal glasses for coding, but all the reviews I've seen seem to think that the fidelity isn't good enough for reading text for lengthy stretches of time. This article is the first counterargument I've seen.
I don't get why people still use Postman when there are nice open-source tools such as Bruno [0], which can do a lot of what Postman does and more; you can even import your Postman collections.
Thank you so much for sharing this. We're actively looking for alternatives to Postman right now, and would be heavily inclined toward an OSS solution.
<pre><code>  LabVIEW 2020+
  Windows 10+
  Git
  TortoiseGit (for its embedded diff tool)
</code></pre>
I'm a big fan of Tortoise, mainly for its revision graph. I must say their 3-way merge tool is the best free one on Windows; the only competing option, p4merge, is less good and closed source.
Also, Tortoise is one of the big reasons I did not switch to MacOS at work (yes, the revision graph; and no, there is no almost-as-good-or-better alternative on Linux/MacOS, but please prove me wrong).
TIL about LabVIEW and the G programming language. Also it breaks my mental image of NASA people working on Linux or MacOS.
Just anecdotal, but I've met a few hundred NASA contractors (and am one, and work in the field), and I'm not aware of anyone ever using MacOS as their primary OS. The laptops are all issued by the government (or a prime contractor, but that's not so different in the end), and I've never heard of them issuing MacBooks.
LabVIEW is really fantastic because it’s really easy to throw lab software together in a few hours or days and just get hardware test stands off the ground, especially when you don’t have a SWE in your department and you have an engineer who just wants to get it working and doesn’t want to get bogged down in code. It’s also pretty easy to make changes to even if you have limited software dev experience. Sure, there are many projects where you really want to have the flexibility of traditional programming languages and have actual SWEs work on it, and the proprietary license is annoying, but it makes a lot of sense when you see non-SWE engineers and techs working with it on the lab floor.
Edit: By the way I’m aware that there are LabVIEW specific SWEs as mentioned in the article who are able to do wizardry with it, but I wanted to highlight its usability beyond that.
This completely differs from my own experience with LabView, which I used a number of times in undergrad and some grad-level coursework (I have a mechanical engineering background), as well as in internships at a couple of different companies. LabView sits, almost uniquely, among the "absolutely not, with no exceptions, ever again for the rest of my life" tools that I've worked with in my career. I don't think I even list it on my resume anymore, because I don't want anyone to know that I've ever touched it and assume I'd be willing to do so again.
I know it's a classic "don't blame your tools!" situation, but the ability for even moderately-experienced programmers to accidentally build high-incidental-complexity tooling that becomes a nightmare to re-learn once you've lost your mental model of the program is, in my experience, unique (and frightening).
I once spent weeks trying to get a LabView-based tool up and running that a senior engineer in another section had written. Sketching out the relationships between components, documenting I/O, etc. After finally giving up, I went to that engineer for help. After spending hours (like, 5-6 hours, not 1-2) sitting next to him in my lab, he said "yeah, I'm not really sure what I was doing with this...", and then needed to take the entire program back to his desk for nearly a week before he could finally explain how it worked.
This situation wasn't a one-off; it's happened with nearly every non-trivial codebase that I've ever touched that used it. In my experience, LabView is really fantastic in only two situations:
a) Very simple GUI-based DAQ tools that the person who wrote the program, and them alone, will need to use
b) Complex tools that are owned by a team of engineers who have written LabView for years and will now be dedicated exclusively to those tools
You missed a lot of its value, which the parent commenter highlighted.
Agreed, it's terrible if you have to maintain something from before, especially since over the years these tools get worse as one engineer after another adds or fixes one thing or another. It's terrible for that, and those codebases and tools should have been migrated to Python etc. long ago.
It is great for getting a piece of hardware working and being tested today. In the next 2 hours. Sometimes you just don't have the time or money in the budget to write custom code for everything. Sometimes you just need to make it work and move on. LabVIEW is great for that.
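For comparison, here's roughly what the "quick Python test script" side of that trade-off can look like. This is just a minimal sketch using PyVISA; the instrument address and the SCPI commands are placeholders I made up, not anything from a real test stand.
<pre><code>
# Minimal sketch of a quick instrument check in Python using PyVISA.
# The resource address and SCPI commands are hypothetical placeholders;
# substitute whatever your bench hardware actually exposes.
import pyvisa

rm = pyvisa.ResourceManager()
print("Visible instruments:", rm.list_resources())

# Hypothetical GPIB address for a bench multimeter.
dmm = rm.open_resource("GPIB0::12::INSTR")
dmm.timeout = 5000  # milliseconds

print("IDN:", dmm.query("*IDN?"))               # standard identification query
print("DC volts:", dmm.query("MEAS:VOLT:DC?"))  # common SCPI measurement command

dmm.close()
rm.close()
</code></pre>
Even something this short still assumes someone on the floor is comfortable with Python and a terminal, which is exactly the gap LabVIEW fills.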
> I must say their 3-way merge tool is the best free one on Windows; the only competing option, p4merge, is less good and closed source.
A long time ago, I used Araxis Merge [1] and I can strongly recommend it [2]. Having used both TortoiseGit and p4merge personally, I found it specifically better than either.
Indeed, it seems to be a very good diff tool, plus the license is for life.
But this is not the killer feature that would make me replace Tortoise (in fact, you can even configure Tortoise to use an external diff tool instead of its own).
The killer feature for me is the revision graph [0], and even though Tortoise is open source, I can't find anything good enough on Linux/MacOS to approach the features of that revision graph. But once again, please prove me wrong.
It reminded me of a story I saw on the internet where companies wanted to help provide meals for poor people, and most of them gave money, except for Toyota, which improved the supply chain by applying the methodology they use to build their cars. That basically increased the number of meals delivered per day by ~40%.
I think the first AIs were trained on code generated only by humans, and the quality was quite OK. Now AI has already produced a significant quantity of less-than-OK code that humans have spread across all the different code hosting sites.
That quantity of AI-generated code has "polluted" the training sets of all future AIs, and an AI feeding on its own output cannot give good results. I don't see how the quality can improve.
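To illustrate the feedback loop I mean, here's a toy sketch in Python. Every number in it is invented purely for illustration; it's not a model of any real training pipeline.
<pre><code>
# Toy illustration of the "AI trained partly on AI output" feedback loop.
# All parameters are made up; this is not a measurement of anything real.

def next_generation_quality(prev_quality: float,
                            human_quality: float = 1.0,
                            ai_share: float = 0.5,
                            degradation: float = 0.9) -> float:
    """Quality of a model trained on a mix of human code and the
    previous generation's (slightly degraded) output."""
    return (1 - ai_share) * human_quality + ai_share * degradation * prev_quality

quality = 1.0  # generation 0: trained on human-only code
for gen in range(1, 11):
    quality = next_generation_quality(quality)
    print(f"generation {gen}: relative quality {quality:.3f}")

# With these assumed numbers, quality settles a bit below the human baseline
# instead of improving, which is the worry above in miniature.
</code></pre>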
I have another simple catch: how is the energy harvested by a Dyson Sphere supposed to be transported to the planet where people live? I suppose a direct cable is out of the question, and I don't think wireless electricity is possible, so what remains?
Batteries charged at the sphere, then transported to the planet, and returned to the Dyson Sphere when empty?
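To put some rough numbers on why the battery idea doesn't scale (back-of-envelope, all values are round figures I'm assuming):
<pre><code>
# Back-of-envelope: mass of lithium-ion batteries needed to ship one day's worth
# of energy from a Dyson swarm back to a planet. Round, assumed numbers.

SOLAR_LUMINOSITY_W = 3.8e26           # total power output of the Sun, in watts
CAPTURE_FRACTION = 1e-9               # assume the swarm captures only a billionth of it
LI_ION_ENERGY_DENSITY_J_PER_KG = 9e5  # ~250 Wh/kg, typical lithium-ion cell

captured_power_w = SOLAR_LUMINOSITY_W * CAPTURE_FRACTION
energy_per_day_j = captured_power_w * 86_400  # joules collected in one day
battery_mass_kg = energy_per_day_j / LI_ION_ENERGY_DENSITY_J_PER_KG

print(f"Captured power:       {captured_power_w:.2e} W")   # ~3.8e17 W
print(f"Energy per day:       {energy_per_day_j:.2e} J")   # ~3.3e22 J
print(f"Battery mass per day: {battery_mass_kg:.2e} kg")   # ~3.6e16 kg
</code></pre>
Tens of quadrillions of kilograms of batteries per day, for a billionth of the Sun's output, which is why these proposals usually lean on beamed power (microwave or laser) rather than anything you physically haul around.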