Exploring the new .NET “dotnet” CLI (hanselman.com)
193 points by Nelkins on Jan 7, 2016 | 126 comments



In case you missed it: "dotnet compile --native" is basically like the Go compiler: it statically links all the DLLs and produces a single PE/ELF executable.

This is the most exciting thing for me in the new dotnet CLI, as a person writing Go full-time at Microsoft. Most developers will be able to take advantage of single-binary shipping; they'll just put that in a Docker container and happily run everywhere.
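
For the curious, here's a minimal sketch of that workflow using the commands discussed elsewhere in the thread (the output path is hypothetical; it varied between previews):

    $ dotnet new                  # scaffold a Hello World project with a project.json
    $ dotnet restore              # download NuGet dependencies
    $ dotnet compile --native     # AOT-compile and statically link a native executable
    $ file bin/Debug/native/hello # hypothetical output path
    hello: ELF 64-bit LSB executable, x86-64, statically linked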


Why does everyone need to bring up Go for what is a standard feature of AOT compilers to native code?


Because static linking has been out of vogue for so incredibly long in the C/C++ worlds, and beyond that, most languages either compile down to C or are interpreted.


Depends which C and C++ worlds you mean.

In the GNU/Linux world maybe, but that is just one planet in the C and C++ galaxy.


I mean broadly almost all unixes, Mac, and Windows. All the big platforms. In embedded systems and plan9, static linking never left. But for mainstream development for desktop and server systems? Static linking has been out for a while.


Gamedev is also very pro-static linking. *nix doesn't encompass all of C/C++.


Huh. You learn something every day. But for the most part, everybody still links against libc dynamically. Solaris has actually stopped providing libc.a!


Which makes sense when you think of libc as "the library which provides the system call interface to the OS, and just happens to provide a C runtime library as well".

Making libc dynamic-link-only allows the OS developers to change the userland/kernel interface as they see fit without breaking compatibility. The fact that this also ties you to a dynamically linked C runtime library is an artifact of the close ties between Unix and C.

Over in Windows land, there isn't a tight coupling between libc and the system call interface. libc is provided by your compiler (Visual C++ provides static and dynamic versions), while the system call interface is provided by ntdll.dll and friends, which are part of the OS and free to change in future OS versions.


True enough, I suppose. It did break some stuff though. Go apparently encodes the syscall table.

https://news.ycombinator.com/item?id=8817568


Yes, because the changes they made to the library aren't compatible with static linking.

https://blogs.oracle.com/rie/entry/static_linking_where_did_...

However, that is one implementation of one compiler in one specific OS.


True, but it shows the way the wind is blowing.


Because it is 2016 I guess?

So many languages did this for so long, and then VMs came into vogue. Go not going that route was, for many people, a reminder that it could be done.

But for a managed language runtime like the CLR to start supporting it is a big deal.


Well, Mono has supported this for a while now. This is what Xamarin.iOS is based on.


This is true. It's funny because the APIs have always existed in the Microsoft compilers, but they would literally throw not-implemented errors when run. Mono actually implemented them.


I guess this shows how much monoculture is still prevalent in the industry, with siloed language programming and no CS knowledge.

But I do agree it is a big deal, especially for getting more developers to adopt safer programming languages.


Particularly on the topic of binaries. I've tried using gccgo (the Go frontend for GCC, or "the stepchild Go compiler for the sake of having a second compiler") on an embedded device.

It turns out that gccgo stores critical information for the program execution as debug information, which strip will happily remove, leaving you with an undecipherable error on execution.


Because Go made it popular with mainstream developers.


Mainstream languages compiled to native code have always supported static linking.

Maybe you mean developers of scripting languages?


C# never really had this if I recall (years ago), and Java had it but required monkeying around.


I specifically singled them out by saying compilation to native code by default. Java and C# compile first to intermediate code.

In any case, C# always had this in Singularity OS.

Java always had this in commercial JVMs targeted at the embedded market.


For the same reason gofmt is hailed as a great achievement that's never been done before.

Except for every other code formatter ever.


This is, in fact, really cool. Asking as someone who's relatively unfamiliar with .NET - what are the most significant differences between compile --native and ngen.exe?

If I understand ngen correctly, it only compiles to native code at install time - meaning that the assembly and its IL code need to be included. Is this correct, and are there other key differences that I've missed?


If I'm reading the article correctly, that's a 656KB "Hello World" binary. 671,644 bytes just to print a single string to standard output and exit? That is smaller than the whole .NET framework, but considering that a real native "Hello World" binary is <1KB, what's the other 655KB doing?


While your point is absolutely true, the rule of thumb in these cases is that the growth curve in file size is not linear; while the minimal executable already brings in lots of things it doesn't need, adding functionality makes it grow slowly (since a lot is already there).

I'm not saying this isn't an issue (although it isn't really in most fields, except for things like embedded), but the 656 KB for "Hello World" is due to the "upfront cost" in binary size.

I spent some time developing in Delphi years back, and this was even more true there (a simple hello-world GUI app was over a megabyte [create new project and compile]; a complex GUI app was barely above that).


Plus, embedded environments that have problems with a 656KB executable aren't really a good target for .NET to start with.


If it's truly just a single binary, wouldn't that include the whole GC and runtime system? If so, it's probably the entire core runtime plus a few bytes to Console.WriteLine a string.


> but considering that a real native "Hello World" binary is <1KB, what's the other 655KB doing?

Language runtime.


You may enjoy this dated blog post for a hint as to why the executable is 656KB, not to mention the runtime and garbage collection overhead:

http://blogs.msdn.com/b/abhinaba/archive/2008/09/15/how-many...


There are larger runtimes.

    # echo 'main = putStrLn "hi"' > hi.hs
    # ghc --make -optl=-static -optl=-pthread hi.hs -o hi
    [1 of 1] Compiling Main             ( hi.hs, hi.o )
    Linking hi ...
    # file hi
    hi: ELF 64-bit LSB executable, x86-64, version 1 (SYSV), statically linked, not stripped
    # du -hs hi
    10.6M   hi
    # strip hi
    # du -hs hi
    4.4M    hi
This is against musl libc by the way.


Have you seen the size of Go executables generally?


If you truly wanted your app to just print out the one line then you could probably pick a different language.

In this case you would only pick .NET if you were making something complex enough to warrant the size cost in exchange for the perceived benefits.

I mean, comparing it to a raw native binary isn't an apples-to-apples comparison. What might be better would be comparing it to another VM-based language like Java / Ruby / Python / Node, where the size of the app is the scripts plus the installed runtime.

Your one-line "console.log('hello world!')" Node app may be less than 1K, but with the size of the node binary added in, it becomes more comparable.
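
A rough back-of-the-envelope illustration (node's binary size is a ballpark figure and varies a lot by version and platform):

    $ echo "console.log('hello world!')" > hi.js
    $ wc -c hi.js
    28 hi.js
    $ du -h "$(which node)"       # the runtime you're implicitly shipping
    25M     /usr/local/bin/node   # ballpark; varies by version/platform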


What does Microsoft use Go for?


I imagine it might be the Docker related tooling for Azure.


They also have an Azure SDK for Go which we are using at HashiCorp in Terraform and Packer - https://github.com/Azure/azure-sdk-for-go. I believe the Visual Studio Code editor supports Go as well.


Yes, Visual Studio Code was demoed at the latest developer event.


What's the purpose of using Docker if it's a single-binary only?


That's great. Could've used this in a project last year.


I'm so happy to see arguments specified with "-" and "--" (UNIX-style) instead of "/" (DOS-style).

The DOS-style CLI arguments used by Microsoft's other command line tools seem like such a stubborn throwback to a long-lost war. "/" won the battle for folder path separators (see: URLs) and "-" won the battle for argument delimiters.


>The DOS-style CLI arguments used by Microsoft's other command line tools seem like such a stubborn throwback to a long-lost war. "/" ... "-" ...

The reasons were technical[1], not stubborn resistance.

The hyphen "-" was allowed in DOS/Windows filenames. The "/" was not. Therefore, if you programmed your command utility to use "-" as command line switches to match UNIX convention, your app would be "broken" because it couldn't accept filenames that began with hyphens such as "-verify.txt"

If you then try to mitigate that with an "escape" switch such as double-hyphen "--" to turn off the hyphen processing, that means you can't operate on files that are named as "--". (Viruses and malware love creating files with legal filenames that utilities can't open.)

So then you layer another hack on top of that and create the mother of all escape sequences with something like "--switch_character=-" (and then cross your fingers that nobody has a file that's actually named "--switch_character=-").

(Arguably, you could make "myapp -- --" be unambiguous by imposing a rule that position is significant (the 1st "--" is parsed as a semantics switch, and the 2nd "--" is parsed as the oddly named file) but now you've given up the flexibility of specifying parameters in any order which many UNIX utilities do allow.)

Instead of all that complication, you just let "/" be the switch character and it's easier because "/" is already an illegal character for filenames. (When in Rome...[2])

[1]https://en.wikipedia.org/wiki/8.3_filename#Directory_table

[2]https://en.wiktionary.org/wiki/when_in_Rome,_do_as_the_Roman...


Except that Unix also allows "-" in the filename, and as far as I know always has done. Using "--" to mark that all subsequent arguments should be treated as filenames rather than options has worked fine to allow Unix programs to access files beginning with a dash (including files called "--").

The actual reason that DOS uses "/" to mark options isn't technical, it's purely historic: that was the character that CP/M used.
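
For anyone who hasn't run into it, the convention in action (exact error text varies by implementation):

    $ touch -- -verify.txt        # create a file whose name starts with a dash
    $ rm -verify.txt
    rm: invalid option -- 'e'     # the filename is parsed as options
    $ rm ./-verify.txt            # workaround 1: prefix the path
    $ touch -- --
    $ rm -- --                    # workaround 2: "--" ends option parsing, even for a file named "--"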


And UNIX allowing - in filenames is a massive security problem[0]. If you go grab a bunch of random scripts off of GitHub many of them won't correctly escape filenames, and a specially named file can alter how the script executes.

[0] http://www.net-security.org/article.php?id=2061
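
The classic demonstration, for the unfamiliar: a filename that wildcard expansion turns into flags:

    $ touch -- -rf important.txt
    $ rm *                        # the shell expands this to: rm -rf important.txt
    $ ls
    -rf                           # important.txt is gone; "-rf" was read as options, not a filename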


The security problem is incompetent programmers, not dashes, then.


Users and programmers. It is trivial to mistakenly execute a command on the shell which is subject to this issue.

I'd call the behaviour "dangerous by default." You need specific training to be aware of and overcome the issue; without that training you're likely executing commands that can be taken advantage of (in particular, recursive commands over files and directories you yourself don't control).


It's technical in the sense that the historical decision was already made at the OS layer and subsequent DOS/Win programmers have to work with it in the app layer.

Yes, 40 years ago, the decision was arbitrary, not technical.


I don't think that's the reason; you could just quote the filename (which is commonly done if the name contains spaces anyway):

    myapp "-verify.txt"


Relying on quotes as special characters is not that simple to parse either:

http://blogs.msdn.com/b/twistylittlepassagesallalike/archive...

That article also doesn't go into the difficulties of quotes in legacy DOS apps. Passing/ignoring quotes in DOS command lines was extremely difficult compared to the Win32 API. CMD.EXE would eat the quote delimiters and your app wouldn't even see them. (It's been many years since I last tested this, so a new OS like Windows 10 may have changed things.)


Fun fact: most MS kernels, including DOS, supported "/" as a path separator. They were just filtered out of the command lines by the argument passing code.

There was a system call which would tell the argument parser that you wanted a different switch character, and if you changed it to, say, "-", then "/" would work. Of course, not everything used the standard argument parser, so if you did change this you ended up in a world of pain because your tools would start behaving inconsistently, but it was theoretically possible.

I've seen references to this code existing as late as XP, but I don't do Windows any more, so can't comment on later versions.


In terms of file paths, Windows seems to have always supported \ or /, both in the API and in software like Explorer. That's still true for Windows 10.

That leads to some interesting answers on Stack Overflow sometimes about cross-compatibility, where the correct answer of "use whatever constant your language has for directory separator" gets down-voted in favor of "use forward slash everywhere".


Because it isn't the correct answer.

Many applications assume \ and deal with paths directly, instead of using the Windows APIs for path manipulation.

This means the moment your application gives a / to another application, there is a high probability that it will break, regardless of what the Windows API supports.

Even cmd doesn't handle / properly.


I know; you'll notice I said above that "the correct answer [...] gets down-voted". I was not suggesting using forward slash was the way to go.

You can take it as a side criticism of how things are turning sour on Stack Overflow, with "popular" being more valuable than "right", I guess.



But a path in Windows is very unlikely to start with / ... I went through some serious pain trying to pass command options to a Windows program from a bash script last night, in fact, so it's very fresh in my memory.


LOL, just last night I was updating my start-ssh-agent[1] script for bash in Windows to parallel the logic of the cmd file that comes with msysgit[2]. I spent hours trying to figure out a way to pass `/IM` to taskkill without it expanding out to `/c/.../IM` from bash itself... before finally finding a solution[3], once I stopped searching for a generic answer[4].

Painful to say the least.

[1] https://github.com/tracker1/msysgit/blob/patch-1/bin/start-s...

[2] https://github.com/msysgit/msysgit/blob/master/cmd/start-ssh...

[3] http://stackoverflow.com/questions/34647591/passing-windows-...

[4] http://stackoverflow.com/questions/30195353/trouble-running-...
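
For anyone else fighting MSYS path conversion, here are the two workarounds I'm aware of; treat them as a sketch, and note the environment variable is specific to Git for Windows, if I recall correctly:

    $ taskkill //IM ssh-agent.exe //F                     # MSYS collapses "//" to "/", so taskkill sees /IM and /F
    $ MSYS_NO_PATHCONV=1 taskkill /IM ssh-agent.exe /F    # or disable the conversion for one command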


[deleted]


"You make it sound like people killed and died over this."

He really did not.

Also, https://news.ycombinator.com/newsguidelines.html


I think the most exciting thing about this is the AOT compilation into a single binary (Go-style), something I've wanted for a long time [1].

[1] https://news.ycombinator.com/item?id=9590213


Aren't the plans for AOT to work only with WinStore apps?


No.

Currently .NET Native (AOT) is used with WinStore apps - in fact every Win10 .NET application in the store and on your computer HAS to be .NET Native - but the plan is to make compiling a native DLL possible for all .NET Core applications.


No - they first mentioned plans for cross-platform Go-style single-binary compilation at Connect (one of the videos here, I believe: https://channel9.msdn.com/Events/Visual-Studio/Connect-event...) and I'm pretty sure they've subsequently talked about it in more detail in one of the community standups.


What's the difference between a "go-style" binary and any other statically-linked binary?


None, but apparently many seem to have only ever seen statically linked binaries in Go.

So many write "Go-style" as if it were something just invented by the Go team.

Such is the state of knowledge in our industry.


Now now, there are plenty of people in our industry who would cringe at even the term "Go-style" as a description for statically linking libs in, though most of them have long since left HN. I finally got around to reading that PARC/Cedar doc you linked a while back, last weekend. Interesting stuff. Thanks again.


Glad you liked it.


I imagine even grognards would describe many things as C/UNIX-style that actually have their roots in earlier systems.


Just ask pjlmp about it. He still seems to be bitter that the lispm and Wirth's stuff lost to C and Unix.


EDIT: pjmlp. I don't make a habit of getting people's usernames wrong, but that one's hard to read.


No problem. :)


I wrote Go-like because I think of C#/.NET and Go as somewhat comparable, and I'm not aware of any other language in a similar class that produces statically-linked binaries. However I'm pretty ignorant, so would not be surprised if there are many ... examples?


C and C++ will produce statically linked binaries if you ask nicely (gcc -static).
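
For reference, asking nicely looks like this (a minimal sketch; file output abridged):

    $ echo 'int main(void) { return 0; }' > t.c
    $ gcc -static t.c -o t
    $ file t
    t: ELF 64-bit LSB executable, x86-64, statically linked, ...
    $ ldd t
            not a dynamic executable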

There are some traps though; one I ran into for example is that you require dynamic linking in order to use libnss. I ended up producing a mostly-static binary that just depended on the platform's libc but brought its own libstdc++.

Similarly on Windows, statically linking third-party COM components is not a good idea.


Any programming language that has AOT compilation to native code among its implementation choices.

There are plenty to choose from, but if you want a list:

PL/I, Algol, PL/M, Quick Basic, Turbo Basic, Quick Pascal, Turbo Pascal, C, C++, Ada, Eiffel, Haskell, Objective-C, OCaml, Delphi, Oberon, Oberon-2, Component Pascal, Mesa, Modula-2, Modula-2+, Modula-3,...


Do all of those support static linking with the default tooling? (That is to say, not just hypothetical support, but actual support)

Jeez.


Of course, once upon a time static linking was the only way.

It is dynamic linking that is hard to implement, not the other way around.


I KNOW that. I just wanted to know if the native compilers for the languages you mentioned supported static compilation either through a switch or by default, or if it's something that somebody else would have to write.


Initially by default; many of them are older than OSes with dynamic linking support.

The ones with compilers that also support dynamic linking do so either via a switch or via explicit dynamic modules.


I recently rediscovered (lying around on my disk) a bunch of Free Pascal[0] programs that I wrote and compiled in 1999. The executables are all statically linked and still run just fine on my current Linux system. The simple "hello world"-esque ones take up just tens of kilobytes of space.

[0] http://freepascal.org/


A lot of today's developers have never done it with another language and so know it as "the thing that Go does".


They demoed it for the Linux command line and touted that, so I doubt it.


I'm hoping that Winforms/WPF apps are supported at some time soon.


You probably won't see that. This is .NET Core only. .NET Core is a cross-platform, slimmed-down version of .NET and doesn't run WinForms or desktop WPF. Windows Store WPF is supported.


If they really wanted to, they could open source the Silverlight GUI framework. That was basically WPF lite. They could take out the video/DRM stuff and just put the GUI layer out there - it was ported to OS X, so it's at least somewhat portable.

It would be nice to have something like JavaFX for .NET - a resurrected Silverlight could be it.


What I've been doing in my solutions is adopting the Adobe approach and keeping a separate project for Qt, with all of the UI 100% abstracted out into it. Cross-platform builds are pretty easy; I get all of the safety of C#, all of the cross-platform tooling on NuGet, and native UI on all 3 platforms with Qt. The only time I come close to touching WinForms/Silverlight or WPF is if a client is dead-set on using Telerik or whatever.


and drivers.


> I think folks using and learning .NET should have the same experience as with Go or Ruby.
> Easy Compile and Run
> Just "dotnet run" and it compiles AND executes

As someone who does not follow the .NET landscape that closely, can anyone explain why this is a big deal? Java has been able to do 'javac Hello.java; java Hello' for as long as I can remember. If this was so important, why are they only getting around to it now?


Your story (javac / java) was supported since day 1. You had a compiler (say, mcs or csc) that you could invoke manually, passing the input files (csc Hello.cs) and could run the result (either via 'mono Hello.exe' or just 'Hello.exe').

This is more like Maven or Leiningen, I guess - you can build and run, manage dependencies, create projects. That said, I'm not up to date on the Java landscape.


If Java did the same, it would be "java Hello" regardless of whether it was built or not. It would compile and run, or just run if it was already compiled. Not sure why that is a big deal, but I do think the separation between the JRE and the JDK in Java is mostly a headache. I understand the end user doesn't need all the docs etc., but shipping the compiler with the runtime, why not?


It has been done for years by third-party JVM vendors, but apparently many are too focused on the OpenJDK.

Also, since Java 8 you can package everything together with the reference JDK, with no need for third-party tools any longer. This will be improved with the Java 9 modularity and possible AOT compilation in Java 10.


Do you actually use Java like this on any non-trivial project?

This is intended to be a high-level tool that can be used as an alternative to IDEs.


Learning the tooling is a very hard barrier to entry. Not that it's difficult per se, but a lot of people will try something to "see how it is", and being able to just get it to run, right now, no multiple commands, no nothing, lets them concentrate on the interesting parts.


Yes, for a simple 'Hello World' in one class file it is not a big deal. But 'java/javac' is very clumsy and insufficient for a big project with a large number of class files and dependency jars. I hope 'dotnet' will be more sophisticated.


This is more along the lines of gcj than javac.


Dumb question, but if I install this on OS X does it support VB.Net or just C#?

I have a ton of old CLI apps I wrote in VB.Net 10 years or so ago and it would be interesting to port them. Mono's support for VB.Net last I tested was not good.


No VB yet, but that's just because I haven't had time to do it yet -- VB is trivial to add support for.

In fact, the package I used for the compiler binaries contains the csc + vbc compilers, so it's literally just shoveling some command line arguments down the pipe.

It's definitely on my to-do list. :)


I hope you don't mind me replying here. I figure if you're working on the tools you might appreciate some feedback...

Annoyingly it seems to work differently from how DNX builds. With my project [1] I have spent what felt like an age setting up a .NET Core supported project.json that builds with 'dnx build'; the result is horribly slow compilation and maxed-out CPU usage whenever I build in VS2015, even if nothing has changed - actually borderline unusable. I have rebuilt all of the csproj files and the sln that I originally deleted to make way for the project.json and xproj files. So I now have:

  * project.json per folder
  * <project name>.project.json per folder
  * <project name>.csproj per folder
  * <project name>.xproj per folder
  * <project name>.sln per folder
The per-xproj sln is tedious, just so I can tick a box saying 'Produce output', like I'd ever not want to produce output. Total mess.

So I thought I'd give this (dotnet CLI) a go. I can't run from the root because there's not a project.json there. If I go into the Core project then it compiles OK (on OSX with no issues), but builds in the <project>/bin folder like MSBuild. That means the other projects in the repo can't find their references like they can with the DNX artifact system. It seems to me that it should support the same DNX project.json/global.json dependency system.

Whilst I support the general direction MS is taking with DNX and 'dotnet', the whole system is a buggy, inconsistent mess and has recently cost me weeks of time trying to figure out the magic combination of settings needed to support cross-platform builds (actually, most of the time has been spent waiting for builds, waiting for VS2015 to stop locking up, or giving VS2015 enough 'lock up time' before killing it from the Task Manager and starting again).

I'm also not sure MS has quite prepared itself for the public evolution of its tools and source. It has led to a trail of misinformation throughout the web - and I for one have found the process incredibly difficult to follow. Trying to piece together what is going on through GitHub issues on the aspnet/dnx/fsprojects repos is tedious. I still don't really know if I'm doing it right, and the 'official' help is light on details at best.

[1] https://github.com/louthy/language-ext


Totally understandable -- a lot of this stuff is really new, so we've fallen a bit behind on documentation, partially because we're not exactly sure how some stuff should work, so we don't want to mislead people.

The dotnet-cli stuff is super new, so a lot of the tooling is incompatible right now. As things start to settle out in the next few weeks we plan to write a series of blog posts about how things work, general development approaches, and where you can expect to hit rough edges.

Thanks for the feedback, though! We're definitely a bit new to this style of development.


This looks like it includes C#, VB.NET, F#, and any other language that targets the CLR (Common Language Runtime, i.e., Microsoft's JVM). edit: Apparently not. Mea culpa!

Also, I don't know how long ago you tried Mono + VB, but especially since MS started collaborating with Mono and offering upstream first-party support (~2 years), support for the IL has been almost production quality. (Now it is production quality - I'm running some of my customer ASP.NET applications, which were built around NHibernate rather than EF/MS SQL, on full Linux infrastructure.)

Satya is making a ton of bold moves that are basically antithetical to the Ballmer-era MS. Visual Studio Code (yet-another-Electron-based-editor) and Community, along with the ridiculous amount of open stuff on GitHub, have effectively eliminated the vendor-lock-in stronghold that Microsoft previously had. Residual Office renewals and upgrades on the MS SQL stack will only get them so far. They're going full Docker[1] support on the new Windows Server stack, as well as full .NET support off the Windows stack, which is interesting to say the least. I wonder if we'll start seeing them buy out companies that are traditionally aligned with open source.

A bold move they could make would be buying out Cognitect. Datomic could be a perfect fit for them. Integrate Visual Studio support (e.g. as good as IntelliJ/Cursive) and give CLJS first-party support (let Nolen take a lead spot on the TypeScript team) in VS, and it could be a steal at even a few hundred million dollars. The talent (Hickey + Nolen), the developers they bring over, the perception shift within the dev community, and Datomic itself are likely worth half a billion to MS.

[1] http://www.hanselman.com/blog/BrainstormingDevelopmentWorkfl...


F# support is in progress, next beta I think. Pull requests got merged last week.


TypeScript (and even C#) is a huge step backwards compared to Clojure and ClojureScript. Immutability is a clumsy thing in TypeScript, whereas it's concise and natural in Clojure and ClojureScript.

I doubt a bright person such as David Nolen would like to work with TypeScript as it is right now.

IMHO of course.


I'm pretty sure a few tens of millions (let's say Nolen has 5% equity? I think, give or take a few points, that's about right) for 4 years (generally the buy-out terms for a pair of golden handcuffs) and the freedom to choose the direction in which TS goes (notice I did say "Lead" -- his role would have to be essentially Anders or Meijer's level for it to go through, I'm sure) would be equally amenable to all parties involved. (I don't think you understand the freedom engineers at that level in MS get, especially post-Ballmer.) Immutability sucks in TS? Oh well, here David, take 5-10 SDEs and do whatever you want.

MS gets Clojure (and maybe more importantly the CLJS userbase), Cognitect gets massive reach (CLJS is subsidized solely by Datomic licensing revenue), and MS' corporate customers get Datomic. Just going by the direction MS has been heading lately ("embrace & extend") under the new anti-Ballmer regime, floating an offer of 300-500 million would be beneficial to everyone involved. Let's say there are conservatively 100k Clj(s) users out there right now using Clojure + Sun's JVM. At 300 million that's 3k per active developer acquired right off the bat. You'll lose maybe 10% of the users who live-by-their-MacBook-and-die-by-their-iPhone. Even at 600 million, that's still a cheap price to pay.


I'd say immutability is just about the only thing that's concise in Clojure/Script and everything else about it is a huge step backwards - especially the lisp syntax. Yuck! (Also IMHO.)


It was maybe 2 or 3 years ago I poked around in it and failed to get much to run.


Oh my god. That is more than promising. That is really really awesome.

Finally Microsoft will put out something that is useful for everybody.


I learn a lot from Hanselman, and as someone bridging multiple languages, platforms, and SDKs while at MSFT, this is a new one for me.

I'm adding this to my list for potential 20% time project exploration.

Cross platform and cross language have their interesting aspects...


C# has had a command line compiler since... forever. Is the big news just --native mode?


The --native option already exists; it's not a dotnet CLI thing.

dotnet is a wrapper (like the git command) for multiple executables, to replace msbuild as project file + build tool and add some new functionality (restore NuGet packages, etc.).

I think the real news is having a dumb project.json (NuGet deps, source files, etc.) and an xplat tool (dotnet) that reads this project.json and can restore packages, compile (--native too), publish, package as another NuGet package, etc.

All that, xplat and multi-language. C# works, F# got merged last week, others coming next.

Some example commands:

- `dotnet new` scaffolds a new project (really simple atm)

- `dotnet restore` reads dependencies from the project.json and downloads NuGet packages

- `dotnet compile` calls the csc compiler (via a wrapper, `dotnet-compile-csc`) with source files, defines, references from project.json, etc.

There is a good split of responsibilities; you can call dotnet-compile-csc directly if you want. The dotnet CLI bundles the Roslyn compilers (C#, VB) and the F# compiler, so it's ready to go.

Take `dotnet compile`, for example.

dotnet compile reads project.json and calls (for C#) `dotnet-compile-csc`, passing the source list, defines, references, etc. dotnet-compile-csc doesn't know about project.json, and calls csc (the C# compiler). If --native, `dotnet compile` then calls a compiler from .NET bytecode to native code.

It's more like a replacement for msbuild, with a really good command line story and a common project file (project.json) that can be used from Visual Studio, VS Code, or other IDEs (it's a JSON file, dumb).
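
To make that concrete, a minimal project.json might look roughly like this; the exact schema shifted between previews, so treat the keys other than compilerName and dependencies (both mentioned above) as illustrative:

    $ cat project.json
    {
      "version": "1.0.0-*",
      "compilerName": "csc",
      "dependencies": {
        "System.Console": "4.0.0-*"
      },
      "frameworks": {
        "dnxcore50": {}
      }
    }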


Completely off topic but I think I saw some of your work in the F# repo being merged upstream. Congratulations if I'm not confusing you with someone else!


thx


Thanks for your comments here - really explained what this thing is. I didn't pick up on that from the blog post.


Like another commenter said, thank you for all your work on getting F# into dotnet core!


The command line compiler didn't include dependency management, project creation etc.?


If you're going to build via cli, you'll already be using 'nuget restore'[1] as part of a bat file or whatever in your build.

And you'd have to be pretty belligerent toward VS to use a CLI to create a .NET project instead of clicking 'File > New > Project...'. Although the amount of crap they add to an ASP.NET project these days is phenomenal. You spend 20 minutes deleting all the junk they've added.

It just seems to be a simplification. In all honesty the article doesn't actually say anything really about why we should use it.

[1] https://docs.nuget.org/consume/command-line-reference#restor...


No, the article makes that rather clear: You don't have VS on other platforms. This is a cli interface for every supported OS/platform, while you had to use msbuild/xbuild previously (and a couple of things on top, nuget being one).

Obviously you can do everything .Net related already - if you're sitting in VS on Windows.


csc.exe did not. In ye olde days, you'd have to use nmake or similar (batch files - gahh!) to manage compiling applications that had multiple source files.
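
For reference, driving the bare compiler by hand looks like this; fine for a couple of files, painful at scale (the file names are just examples):

    > csc /out:App.exe Program.cs Utils.cs     # Microsoft csc, DOS-style switches
    $ mcs -out:App.exe Program.cs Utils.cs     # Mono's equivalent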


For those interested in projects like this, I'd recommend following the development of Xebec. It's a very new project (no code releases yet), but it aims to offer a much simpler build system for F# projects:

http://7sharpnine.com/Xebec/


Can anyone clarify - does this support any .NET language (C#, F#, mixed language)? In the 'fabulous' example, I can't see where the source code for "Hello World" actually is.



The dotnet CLI is a new project. It uses some good ideas from aspnet/dnx, like project.json, but it's easier to work with: less magic, more language-agnostic.

C# is used to bootstrap all the tools, so it's the first supported language and the default. Nightly dev builds have F# support, but it's a beta (merged last week, but it's OK).

The F# dotnet CLI (the wrapper for the F# compiler) is a beta, but moving fast; there's not much left to do. For example, the F# support in dotnet new is in a pull request.

The F# compiler built for CoreCLR (bundled with the dotnet CLI, so ready to go) is working pretty well, with some stuff missing (portable PDBs, for example).

F# on CoreCLR is open source: https://github.com/Microsoft/visualfsharp


Mixed in the same project (project.json): no, you cannot do that (I mean C# source and F# source in the same project).

You can have multiple projects (same repo, different directories) and reference one from another.

You can have each project.json use a different compiler by setting the compilerName property (csc for C#, or fsc for F#).

A project can reference another project, so you can reference an F# project from a C# project, or the other way around.

Like now in .NET with msbuild or other tools.


Hopefully the interaction works a little better now than in the last multi-language project I worked on... we wanted to do XML processing in VB because of the native syntax support, but interacting with the rest of the codebase in C# became such a disconnect (changes in VB weren't seen in C# until a rebuild) that it was all converted to C#.


Preach it, brother. That would get this former Microsoftie back to the fold.


.NET CLI, so ... PowerShell?

This looks pretty cool. I wish PowerShell were slightly easier to set up and use, and that there was a repo for extensions other than "google it and then download".


Have you seen the capabilities enabled by the PowerShell Gallery[1] (script and module repository) and the PowerShellGet cmdlets[2] (for interacting with repositories like PowerShell Gallery)? It looks like a useful approach to solving this particular issue.

[1] https://technet.microsoft.com/en-us/library/dn807169.aspx [2] https://www.powershellgallery.com/
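
In practice that gives you a package-manager-style flow (the module name here is just an example):

    PS> Find-Module -Name Pester                          # search the PowerShell Gallery
    PS> Install-Module -Name Pester -Scope CurrentUser    # install without admin rights
    PS> Update-Module -Name Pester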


One of the major tenets of this is (from what I can tell) its multiplatform-ness: it would be able to ship as the tooling in the .NET Core release for any platform.

Powershell is certainly a cool shell, but it is a shell, and that makes it much less likely to be ported+packaged everywhere (Pash exists, but I don't know whether it exists as, say, an Ubuntu PPA.) Whereas, when you release a language, you kind of want its tooling to go wherever it goes.


No.

.NET CLI is a set of cross-platform command-line tools for making .NET applications.


PowerShell is a shell that runs on the .NET virtual machine.

Scott is showing off command-line tools for creating and compiling C# projects.


It's a nice idea, and anything that gives .NET developers more power at the command line is going to get a thumbs up from me. But "one command to rule them all" does feel a little broad. To take a ruby/rails/linux equivalent, this is apt-get, ruby, irb, rails and gem all rolled up together.

Perhaps I'm just overly familiar with the unix small tools philosophy.


There are multiple executables.

dotnet is like git, so `dotnet compile` just calls `dotnet-compile`.

The first-level tools (dotnet-compile, dotnet-restore) read the project.json (the only supported project file atm) and pass arguments to second-level tools.

For example:

- dotnet compile calls dotnet-compile

- dotnet-compile reads source files, references, and defines from project.json and calls dotnet-compile-csc with the source files as arguments

- dotnet-compile-csc calls csc

- if --native, another .NET-bytecode-to-native tool is called

And csc (the C# compiler) is bundled with the dotnet CLI package.


> To take a ruby/rails/linux equivalent, this is apt-get, ruby, irb, rails and gem all rolled up together

Once you have fetched Ruby, I'd be much happier if it was all one command, "ruby", for fetching packages, building, running, etc. That doesn't mean it can't be several pieces of code running in the end, but from the end-user perspective, one top-level util for one set of tasks makes sense.

Compare "git lfs foo baz" with "git-lfs foo baz". I think the first is a lot better, even if what happens is closer to the second.



