...
In the first case we have a "monolithic" application with one main and a lot of functions. This design allows a single DB connection and the sharing of variables between programs, but it does not allow several "dialogs" to be open in parallel, which is frustrating for the user because they have to close one "dialog" to open another.
We did some tests with the "parallel dialogs" but without success.
In the second case, an application split into a multitude of "main" programs, we gain user flexibility (the user can have several dialogs in parallel), but
at the cost of a high number of DB connections and no "direct" data sharing between the different "main" programs. Another idea might be to launch a main module not as an independent process but as a "fork" of the "main menu"; this would allow a single connection to the DB (at least with Informix, where the feature is supported) and the ability to share data.
...
I would've said that the overwhelming majority of the code I see follows the second case. At any given point a single user may have multiple Genero programs running, each program having its own private variables and each program having its own database connections. I can't remember when I last saw your first "monolithic" case.
A typical scenario might be that a store worker is running:
"Main Menu" - the program they launch at the beginning of the day, from which they launch other programs
"Sales Entry" - a program to enter new sales
"Debtor Enquiry" - a program they can use to check debtor balances, launched either from Main Menu or from Sales Entry passing a debtor code as an argument
"Stock Enquiry" - a program they use to check stock balances, launched either from Main Menu or from Sales Entry passing a product code as an argument
So this would be 4 separate database connections, quite possibly 4 copies of the same prepared or declared database cursor, and 4 instances of the same variables, e.g. username.
Over the course of the day, the user might start a few more programs:
"Transfer Entry" - to transfer some stock in
"Purchases Entry" - to order some stock
"Customer Maintenance" - to update customer details
etc etc
and I guess the scenario you are worried about with this model is that by the end of the day the user has all these programs open, sitting idle but each consuming a database connection.
You can be smart about your database connections. As Seb said, for the case of Main Menu, once the menu is loaded, do you still need the database connection?
Similarly, for each program, encourage the user by design to exit the program when they have finished, rather than going to a point ready to enter a new transaction and waiting, so that you don't get lots of programs sitting idle. Use ON IDLE to exit the program, etc.
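To make those two ideas concrete, here is a minimal sketch, assuming a hypothetical database name "stores", a hypothetical load_menu_entries() function, and an illustrative 10-minute idle timeout. It shows a Main Menu that drops its connection once the menu is built, and an ON IDLE trigger so the program doesn't sit around holding resources:

```4gl
MAIN
    CONNECT TO "stores"          -- "stores" is an assumed database name
    CALL load_menu_entries()     -- hypothetical: read the menu rows into memory
    DISCONNECT CURRENT           -- the menu itself no longer needs a connection

    MENU "Main Menu"
        ON IDLE 600                      -- after 10 minutes of inactivity ...
            EXIT MENU                    -- ... exit rather than hold resources
        COMMAND "Sales Entry"
            RUN "fglrun sales_entry"     -- child program opens its own connection
        COMMAND "Exit"
            EXIT MENU
    END MENU
END MAIN
```

The child programs still each connect, but the parent menu, which is the program most likely to sit idle all day, no longer ties up a connection.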
This has reminded me of an early Genero transformation where the database started to slow down. This was because the customer had taken the opportunity to tidy up the code so that database cursors were used rather than static SQL statements preparing the same database operation multiple times. However, this resulted in hundreds of Point of Sale terminals throughout the country sitting at an INPUT of login and password, consuming database memory whilst waiting for the next store worker to come along and process a sale. The solution was to make sure FREE and CLOSE were used to free up resources when a program got to this point. That is, did we really want to hold onto those resources for minutes just to save a second or two? 15 years later, I would probably also suggest Web Services as a solution, so ...
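As a rough sketch of that fix (the cursor name and SQL here are invented for illustration), the idea was to release the cursor and its prepared statement before the program parks at the login INPUT, and declare it again when work resumes:

```4gl
-- Sketch only: c_sale stands in for whatever cursors the program had prepared
DECLARE c_sale CURSOR FROM "SELECT total FROM sales WHERE sale_id = ?"

-- ... process sales using c_sale ...

-- Before sitting at the INPUT of login, password waiting for the next worker:
CLOSE c_sale   -- release the result set held on the server
FREE c_sale    -- release the prepared statement and its server-side memory

-- When the next worker logs in, DECLARE the cursor again before use.
```

Re-preparing costs a little time on the next sale, but it stops hundreds of idle terminals from pinning database memory for hours.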
As Seb suggested, you might want to consider a multi-tier solution. Add to your tests a third case, 'an application split into a multitude of "main" we gain user flexibility,' like you had, but code it so that each application has no database connection and any database activity is done via a Web Service call. I could see this being advantageous in a store or branch situation where you have many instances of the same program and a limited number of SQL statements executed multiple times. Instead of having 100's or 1000's of these programs, each with their own database connection and copies of the same database cursor, you end up with a pool (probably numbering in single digits) of web service programs that hold the database connection and cursor in memory. You are then looking at the numbers to measure whether the extra overhead of the web service call is worth what you save in database resources.
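On the client side that could look something like the sketch below, using Genero's com and util extensions. The endpoint URL, the JSON field name "balance", and the function itself are all assumptions for illustration, not a worked design:

```4gl
IMPORT com
IMPORT util

-- Hypothetical client-side helper: no database connection here at all;
-- the pooled web service programs own the connection and the cursor.
FUNCTION get_debtor_balance(p_code STRING) RETURNS FLOAT
    DEFINE req  com.HttpRequest
    DEFINE resp com.HttpResponse
    DEFINE doc  util.JSONObject

    -- assumed REST endpoint published by the pooled service programs
    LET req = com.HttpRequest.Create("http://appserver/debtors/" || p_code)
    CALL req.doRequest()
    LET resp = req.getResponse()
    LET doc = util.JSONObject.parse(resp.getTextResponse())
    RETURN doc.get("balance")    -- assumed field in the service's JSON reply
END FUNCTION
```

The trade-off is exactly the one described above: each call pays an HTTP round trip, but the hundreds of client programs no longer hold connections or cursors of their own.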
Reuben