I have faced a lot of issues with publishing. For example, when I need to make small changes to the code, the generated DLL (say, the one compiled from default.aspx.cs when published) sometimes isn't recognized by IIS, which complains that the code-behind is wrong or something similar. Sorry for not remembering the exact error message; I hope you know what I mean at this point.
Therefore, I usually do a simple copy-paste operation instead of publishing.
Could you tell me what I am missing by NOT using the Publish method? How is publishing better? Or which one do you prefer, and why? Basically, it's a pros-and-cons question.
Well, it depends on what you mean by “copy”:
With publishing, you have the option to pre-compile all or part of your application. You can publish to a local folder in your file system (instead of your target/host) and then copy only the updated file(s). If you are making "code-behind" (C#/VB) changes, this means you'll likely only need to copy/overwrite DLLs. It goes without saying that if you've made "content" changes (HTML/Razor/script/etc.), then you'd need to copy/overwrite those as well.
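For the copy/overwrite step itself, a mirroring tool can skip unchanged files for you. A sketch using robocopy (the paths are placeholders for your local publish folder and the server share; /E recurses into subfolders, and /XO excludes source files that are older than the copy already on the destination, so only changed files get transferred):

```shell
robocopy C:\publish\MySite \\webserver\wwwroot\MySite /E /XO
```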
If you're new to deployment, you may find yourself simply copying/overwriting "everything", which is the safest way to go. Once you get more experience, you'll recognize which assets you actually need to update (one or a few DLLs and/or content files, instead of "everything"). There's no magic to this; usually it's just a matter of looking at the timestamp of each DLL/file after you've published (locally) or rebuilt your web application.
I'd recommend doing a local publish so you can see what is actually needed on your server. The files published to your local file system/folder are exactly what needs to be on your host/server. Doing so will make things visible and remove whatever "mystery" there is:
- you'll see what is actually needed on your server vs. what's not
- you'll see the file timestamps, which will help you recognize which files were actually changed vs. those that weren't (and therefore don't need to be updated)
- once you get the hang of it, you won't need to copy/FTP "everything"; you'll update only the files that were actually modified
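The timestamp check above is easy to script. A minimal sketch in a POSIX shell (the folder and file names are made up for illustration): keep a marker file from the last deploy, then after a rebuild, list anything newer than the marker, since those are the only files you need to copy to the server.

```shell
# Set up a fake publish folder with a DLL and a content file
mkdir -p publish_demo
touch publish_demo/Site.dll publish_demo/Default.aspx
touch publish_demo/.last_publish        # marker: when we last deployed
sleep 1
touch publish_demo/Site.dll             # simulate a rebuilt code-behind DLL
# Only files newer than the marker need to go to the server:
find publish_demo -type f -newer publish_demo/.last_publish
# prints: publish_demo/Site.dll
```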
So "copy" can mean the above. Alternatively, if you mean simply copying all of your development code (raw .aspx markup plus .cs/.vb source) to your host, then your site will be dynamically compiled as each resource is needed/requested (nothing is pre-compiled). That is also "easy", but you lose pre-compilation, which means there is a delay the first time each of your web pages is requested (ASP.NET needs to compile it dynamically). Additionally, you are exposing your source code on the server. That may not mean much in your situation, but it is one more thing to consider.
Here’s more info on pre-compilation and options.
Assuming we consider an aspx page and its aspx.cs code-behind file, there are three alternative ways of deploying your site:
- You can copy both to IIS. The .aspx will be compiled to a .cs class upon the first request, and then both .cs files will be compiled into a temporary .dll
- You can "publish" to IIS; this will compile the code-behind class into a .dll but will copy the .aspx untouched. The .aspx will be translated to .cs and then to a .dll upon the first request
- You can "publish" the site and then manually precompile it with aspnet_compiler. Publishing will compile the code-behind into a .dll as before, but precompilation will then clear out your .aspx files (removing their content) and move the compiled code into yet another .dll.
All three models have their pros and cons.
The first is the easiest to update incrementally, but at the same time it is the most open to unwanted modifications.
The second is also easy and can be invoked from Visual Studio; it closes off the possibility of some unwanted modifications on the server, but the .aspx files still need time to compile upon the first request.
The third takes time and some manual steps, but it prevents any changes on the server and also speeds up the site's warm-up, since compiling the assets is no longer necessary. It is great for shared environments.
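The third option above can be run from a Visual Studio (Developer) command prompt. A sketch with placeholder paths: aspnet_compiler ships with the .NET Framework; -v gives the virtual path of the application, -p points at the physical folder of the published site, and the final argument is the output folder for the precompiled site.

```shell
aspnet_compiler -v / -p C:\publish\MySite C:\precompiled\MySite
```

Add the -u switch if you want the precompiled site to keep its .aspx files updatable; substitute your own publish and output folders for the hypothetical paths shown here.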