Hacker News | jnieminen's comments


I would spend a little more and get the Business Basic plan. I think it is quite a good deal.

https://www.microsoft.com/en-ww/microsoft-365/business/micro...


I think I will use that. Getting some of the web Office apps in addition is a bonus.


There is a group policy to "Turn off Microsoft consumer experiences". Editing local group policies requires Windows 10 Pro, but it is also possible to change the registry value directly. See the link for more info, and the sketch below it.

https://www.prajwaldesai.com/turn-off-windows-10-microsoft-c...
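If you would rather script it than click through the registry editor, here is a minimal sketch (assuming the documented CloudContent policy key; run from an elevated Python on Windows):

    # Hedged sketch: set the policy value the article describes via the registry.
    # Assumes the documented key HKLM\SOFTWARE\Policies\Microsoft\Windows\CloudContent
    # and requires an elevated (administrator) Python on Windows.
    import winreg

    KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows\CloudContent"

    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                            winreg.KEY_SET_VALUE) as key:
        # 1 = turn the consumer experience features off
        winreg.SetValueEx(key, "DisableWindowsConsumerFeatures", 0,
                          winreg.REG_DWORD, 1)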


You could have the 2TB iCloud storage add-on and also Apple One with 2TB, for a total of 4TB.


That's only available in a handful of countries (ones that have Apple News or Apple Fitness+ available), and not where I live.


The Retina screen and the fantastic touchpad.


AWS and Azure are permission to spend.


Are they really, though? A serverless, event-driven system I’m working on literally costs less than £10 a month on Azure. Running full-blown VMs instead of cheaper, more appropriate technologies like containers or functions will always cost more.


As per our calculation for CloudAlarm [0], once we reach a few hundred users it would be cheaper to use a dedicated instance than a serverless (Azure Functions) design. So it may vary from system to system depending on the amount of work you perform for each user; a rough break-even sketch is below the footnote.

0: https://cloudalarm.in/ – btw, you may wish to set up daily budgeted, pace-based alerts using it – to inform you when usage spikes (much faster than Azure's consumption-threshold-based alerts).
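As a rough illustration of where the break-even sits (all numbers here are made-up placeholders, not CloudAlarm's actual costs):

    # Back-of-the-envelope break-even sketch with made-up numbers,
    # not CloudAlarm's real figures.
    COST_PER_USER_PER_MONTH = 0.15       # hypothetical consumption cost per user
    DEDICATED_INSTANCE_PER_MONTH = 45.0  # hypothetical small always-on instance

    break_even = DEDICATED_INSTANCE_PER_MONTH / COST_PER_USER_PER_MONTH
    print(f"Serverless stays cheaper below ~{break_even:.0f} users")  # ~300 here

The point is only that the consumption line grows with users while the instance line is flat, so the crossover depends entirely on how much work each user generates.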


I was playing with the Azure "free" tier. Even though I tried to be extremely careful with it, after a while I noticed that I had left a storage blob for a VM hanging around, along with an external IPv4 address. I will continue to use Hetzner Online for my own stuff instead of running it on the "public cloud".
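If anyone wants to do a similar audit before walking away from a subscription, here is a minimal sketch that sweeps it for leftover resources (assuming the azure-identity and azure-mgmt-resource packages; the subscription id is a placeholder):

    # Hedged sketch: list everything still deployed in a subscription so stray
    # disks, blobs and public IPs don't keep billing quietly.
    # Assumes: pip install azure-identity azure-mgmt-resource
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.resource import ResourceManagementClient

    SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

    client = ResourceManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
    for res in client.resources.list():
        print(res.type, res.name)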


You could also use GitHub Pages or Cloudflare Pages for hosting.


The nice thing about using Cloudflare in front of a real host is that you can still do dynamic pages. You can instantly purge any file from the Cloudflare cache, so all you have to do is purge anything that changes and your users see the update instantly, while your backend only sees a couple of extra requests.

My site updates some data every 10 minutes or so, and I don't think that would work with GitHub Pages. Maybe you could do something with Cloudflare Pages combined with Workers, but Workers have a limited free tier. The normal Cloudflare CDN scales to infinity for free. A rough sketch of the purge call is below.
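For reference, the single-file purge is one call against Cloudflare's documented purge_cache endpoint; a minimal sketch (the zone id, API token and URL are placeholders):

    # Hedged sketch: purge one changed URL from the Cloudflare cache via the v4 API.
    import requests

    ZONE_ID = "your-zone-id"      # placeholder
    API_TOKEN = "your-api-token"  # placeholder

    resp = requests.post(
        f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}/purge_cache",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        json={"files": ["https://example.com/data.json"]},  # whatever just changed
    )
    resp.raise_for_status()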


How has your experience with restoring been? Have you had expensive data retrieval costs?


I used to back up with Arq to S3 and the restores were very fast. With Glacier, Arq initiates the data retrieval and waits for file availability in the UI. Glacier can take a day or so, and the laptop needs to stay open and connected to the internet during that time for the transfer to complete. Given I'm primarily backing up external drives, this UX isn't an issue for me, but if you want backups with instant accessibility, backing up to S3 really isn't that pricey.
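For anyone doing the same outside Arq, the retrieval step looks roughly like this with boto3 (bucket, key and tier are placeholders; this is a generic S3/Glacier flow, not how Arq does it internally):

    # Hedged sketch: ask S3 to thaw an object stored in a Glacier storage class,
    # then poll until the temporary copy is downloadable.
    import time
    import boto3

    s3 = boto3.client("s3")
    BUCKET, KEY = "my-backup-bucket", "backups/drive.img"  # placeholders

    s3.restore_object(
        Bucket=BUCKET,
        Key=KEY,
        RestoreRequest={"Days": 1, "GlacierJobParameters": {"Tier": "Bulk"}},
    )

    # The Restore header flips to ongoing-request="false" once the copy is ready.
    while True:
        head = s3.head_object(Bucket=BUCKET, Key=KEY)
        if 'ongoing-request="false"' in head.get("Restore", ""):
            break
        time.sleep(600)  # Bulk retrievals can take many hours

    s3.download_file(BUCKET, KEY, "drive.img")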


There are high-endurance SD cards, but eventually those will also have I/O problems.

