[MonoDevelop] Re: More detailed build response

Todd Berman tberman@sevenl.net
Sun, 04 Apr 2004 16:00:02 -0400


Comments inline

On Sun, 2004-04-04 at 14:28 -0400, Miguel de Icaza wrote:
> Hello,
> 
> > //I assume by this you also mean probing for required assemblies.
> > //This is something we should add some automation for. For example, if
> > //I add gtksourceview-sharp as a reference to my project, the build
> > //system needs to be smart enough to add this to my build requirements
> > //
> > //What are we going to be doing to support existing auto* build
> > //systems. Will there be an easy way to run make on a certain
> > //directory?
> > 
> > We want applications to be self contained, so running an application
> > within MonoDevelop will not cause a full make/make install process to be
> > triggered before it can be executed. 
> 
> Yes, that is what it means: to probe for functionality.
> 
> I have chosen pkg-config to probe, since it is something that the
> configure script that we would generate could easily probe for, and
> integrates with the existing probing systems.
> 
> The makefiles on each directory would be self-contained, so you can type
> `make' to build that directory.
> 
> > //Will this be accomplished with something similar to the build/
> > //directory in MD, or will we be doing something different?
> 
> I honestly need input here.  I would personally leave the binaries on
> each directory, but this might not be ideal.
> 
> A few scenarios:
> 
> 	Visual Studio builds into Bin/Debug and Bin/Release, 
> 	one per project.  I do not quite like this setup, I find it
> 	too nested for my taste.
> 

I agree with this completely.

> 	Using a common directory for placing all the files like MD,
> 	this seems fairly clean, but complicates the `run' target.
> 

I think it simplifies the run target, but complicates the building: make
run for MD is basically a `cd ; mono MonoDevelop.exe', and it doesn't
get much simpler than that.
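
To make that concrete, with a common output directory the generated run
target can stay just about that trivial (only a sketch):

    run: all
        cd build && mono MonoDevelop.exe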

> 	Using the current directory to generate the .exe or .dll seems
> 	the most Unix-y way of doing things.
> 

Well, you have to keep in mind that people's installed image (let's call
it mono.app just to draw the obvious parallel and make it easy to type,
and by this I mean their fake root, like MonoDevelop's build/) might
need to change over time, and requiring people to lose cvs history or do
surgery on the tree is not acceptable.

Also, having a mono.app-style setup would allow for a simple xcopy
deploy (in 99% of cases).
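
To spell out what I mean by xcopy deploy: installing the app would just
be copying the fake root somewhere, roughly (paths made up):

    cp -r mono.app /opt/myapp
    mono /opt/myapp/MyApp.exe

with no relinking or path surgery needed.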

> > //I think we should be adding a third file for this. Another thing
> > //to keep in mind with adding this info to the cmbx file is the
> > //ability to reuse projects.
> > //
> > //Projects themselves should be self contained and know how to build
> > //themselves, so that you can use them from multiple solutions.
> 
> I honestly do not know enough about .cmbx files, or the existing setup,
> can you present what we have today?
> 

Ok, here is the issue. Solutions (cmbx) contain projects, and each
project builds a single assembly, either a dll or an exe. The project
needs to know how to build itself; the solution 'make' (both on the
shell and inside MD) should just iterate through the projects, gather
info, and write the general solution-wide stuff (configure, etc.). Or,
potentially, the solution would basically be a wrapper around descending
into the projects and calling configure, make, make install, etc.

Does that make sense?
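
In makefile terms I picture the solution-level makefile being little
more than this (only a sketch, project names made up):

    SUBDIRS = MyLib MyApp

    all clean install:
        for d in $(SUBDIRS); do \
            (cd $$d && $(MAKE) $@) || exit 1; \
        done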

> > //I assume you mean that we can write one (huge) configure script
> > //that we package with MD and then copy into the solutions top
> > //level folder. If so, what are we planning on writing this in?
> > //Also, are we going to use standard configurelike switches, such
> > //as --prefix, --disable-xxxx, --enable-xxxx, etc.
> > 
> > //Will this configure script allow for complicated conditional
> > //switching? (ie, if this lib is present, build yyy, if not, check
> > //for this other library, and do zzz, and if nothing is there, error
> > //out with this info).
> 
> Well, the configure script is automatically generated from the project
> options: optional dependencies listed, libraries required, and projects
> in the solution.  
> 
> The resulting output should be plain shell.  The resulting script
> should use configure-like switches; it only makes sense.
> 
> I do not know if we want to do more complicated conditional building
> beyond a simple layer, because UI wise, we would enter research land,
> and my intent is to have something working first.
> 

Absolutely, but is there an intended eventual design to support
something like this, or is it beyond the scope?
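
For reference, the sort of probing I would expect the generated
configure to emit (plain shell; gnome-sharp and the config.make file
name are just made up here):

    #!/bin/sh
    # hard requirement, taken straight from the project references
    pkg-config --exists gtksourceview-sharp || {
        echo "Required package gtksourceview-sharp not found" >&2
        exit 1
    }
    # optional dependency becomes an ENABLE_XXXX style flag
    if pkg-config --exists gnome-sharp; then
        echo "ENABLE_GNOME = yes" >> config.make
    fi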

> Doing a setup that handles every possible combination in the world is
> going to be a gargantuan task, so my focus on this design is to address
> common setups, not every setup.

Keep in mind that this build system needs to be self-hosting, which
means it has to handle some non-standard stuff: installing into
different prefixes (to handle the gnome stuff), C code, and gettext
installation (generating gmo files from the po files). I am potentially
willing to have a build system inside MD that is not, and never will be,
self-hosting. However, right now I see no benefit in breaking up the
MonoDevelop module, as that would start to require either parallel svn
checkouts (a la monodoc/mcs) or a make install.
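
The gettext piece at least is mechanical; the generated makefile only
has to carry a rule along the lines of

    %.gmo: %.po
        msgfmt -o $@ $<

but the build system still needs to know when to emit it and where the
gmo files get installed.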

> 
> > //What about when you need to use C, as many projects do (MD included)
> > //Will we attempt to rewrite auto* for them, or encourage the use of
> > //auto* seperately and add hooks?
> > //
> > //The ability to compile C is a must-have for this build system, as
> > //designing something that is unable to self host is not good.
> 
> If they can be built completely with the pkg-config results, that can be
> handled.  If not, I would encourage the C components to be split into a
> separate module, that can be pkg-config for, and that has all the
> autoconf/automake bells and whistles it needs.
> 

Again, see above.

> > //Why not allow for more targets? make dist seems reasonable, as does
> > //make test for running tests. Potentially allowing people to append
> > //content to the makefiles and add their own should be supported.
> 
> I guess it could be added, I do not see why not (other than UI issues,
> and obscure side effects).
> 

Initially the UI could be as simple as a text box that appends text to
the end of the makefile. Potentially this could be greyed out with an
applicable warning: you could break stuff, so don't touch it without
good reason, and whatever you are doing might be something the build
system itself should handle, so file a bug and see what happens before
screwing stuff up.

> > 
> > //I assume there will be support for flagging files as 'dont build'?
>
> Can you give me an example?

Let's say that inside a library I have a Main.cs that does some testing
using the library. Obviously I only want that built when my build
configuration is 'TestServer', or whatever. That would be a file flagged
as 'don't build this'.

That also brings up another point: currently we support n build
configurations. That is something I think might be worth supporting;
any thoughts?
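
In the generated makefile the 'don't build' flag plus a configuration
could come out as simply as (sketch, GNU make syntax, file names made
up):

    SOURCES = Library.cs Util.cs
    ifeq ($(CONFIG),TestServer)
    SOURCES += Main.cs
    endif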

> 
> > //I assume there will be gui for all this, where will this gui be
> > //located?
> 
> This is autogenerated.  I do not understand your question.
> 

I mean, have you thought at all about how you are going to arrange the
GUI representation of this configurability, or is that completely beyond
the current scope (I assume it is)?

> > //Will we add support for libraries that this build system produces
> > //to automatically produce an up to date .pc file and install it?
> 
> Fantastic idea.
> 

Ok, this .pc file generation support needs to be project-bound, not
solution-bound. In fact, the more I think about this, most of this stuff
needs to be project-bound; the solution is a simple wrapper with a
'default start target' attribute.
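
i.e. a library project would get something generated roughly like this,
with the values pulled from the project options (names and paths made
up):

    prefix=/usr/local
    libdir=${prefix}/lib

    Name: mylib
    Description: Library built by the mylib project
    Version: 0.1
    Libs: -r:${libdir}/mono/mylib/mylib.dll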

> 
> > //What exactly is the point of this file? to basically provide
> > //#if ENABLE_XXXX support?
> 
> Yup.
> 
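
If so, I assume the generated makefiles just turn those values into
-define: flags on the compile line, something like:

    mcs -define:ENABLE_GNOME -target:library -out:mylib.dll $(SOURCES)

with the ENABLE_XXXX symbols coming straight from what configure
detected.
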
> > //What about make distcheck, considering that unless we rewrite auto*
> > //for the bundled c code, people using auto* for c might want to at
> > //least allow that code to be make distcheck'd somehow
> > //
> > //Also, i assume this will output a proper .tar.gz will all the
> > //needed stuff.
> 
> make distcheck is required when a human is required to intervene to
> maintain the files that make up a distribution, to ensure that the human
> did not make a mistake.
> 
> If we are not packaging things completely, it would be a build system
> bug, not a user bug, so we must fix it ourselves.
> 
> That being said, we could have a target like `distcheck' aimed as a
> regression test for our build setup.
> 
> > //personally, i think this is *completely* outside the scope of a 
> > //build system.
> > //potentially providing hooks for this would be more than enough
> > //as every packagers requirements are completely different, and
> > //9 times out of 10, people dont package the apps they develop
> > //beyond simple tarballs.
> 
> Open Source developers do not package anything beyond simple tarballs.
> 
> Proprietary developers (which we hope to attract) would like to have a
> way of building binary packages.
> 
> Either RPM or some other form of self-installable software, and I think
> it is a big plus.
> 

I agree here, but I think supporting tarballs should be our #1 goal; we
can add RPMs or other packaging later.
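
A plain dist target is cheap anyway; at its simplest it is just (sketch,
variable names made up):

    dist:
        tar czf $(PACKAGE)-$(VERSION).tar.gz $(DISTFILES)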

> Miguel

My question here:

//Also, how will items such as xml files, or images that need to be
//in a specific location be handled? I know below you address this
//partially with the ability to embed, but not all applications
//will want to use this functionality. And both should be supported

seemed to go unanswered.

Other misc things:

What about running multiple executables from one make run? That seems
like something that should at least be possible. make run should also
potentially be bound to a build configuration if we support multiple
build configurations. (Useful for stuff outside of Debug and Release, I
swear! ;) )

I know I am forgetting something else, but I will send a new email when
I figure it out :)

--Todd