Multiple Applications - Shared Globals & Synching (Read 1775 times)
BWETTLAUFER
Full Member
***
Offline



Posts: 216
Location: Cambridge, Ontario
Joined: Apr 9th, 2010
Multiple Applications - Shared Globals & Synching
Mar 26th, 2015 at 2:54am
Hi everyone!

So, our Sesame database crashed today -- the .db file wouldn't open on restarting the server executable, and we had to shut down our three branches for 20 minutes while I restored a backup from 30 minutes prior and got the server set back up again.  The biggest part of the 20 minute delay was the first client application logging in and the server loading our 1.2GB application -- while I watched it load, an idea occurred to me on how to make our system more stable, faster, and diversified as we continue to grow.  I'd like to run this really long chain of thoughts by you all and see if it's possible.

Background -- we run a CRM application, currently with 250,000 consumer records, 2 million note records, and about 100,000 payment records tied to those consumer records.  If I have 1-12 users logged in, everything runs smoothly with no loss of performance.  However, we run the application across three branches with up to 28 users currently.  The Ubuntu server it runs on at 'head office' has 5GHz x 8 cpus, but due to the threading limit of a 32 bit Sesame program, it can only actively use 2 cpu cores at any given time.  The biggest slowdown (and risk to the database) is when we run a mass update, import, or enter notes -- the mass updates inevitably use an xlookup command, which slows down everyone in all of our offices as it hogs one or more cpu threads.  The problem compounds if the user is remote to our 'head office' server, because of the delay of the internet connection vs. a local network.

So, my thought today is to set up a master database in one branch, and identical (for the moment) mirror copies in our other offices.  Each office would work independently on its mirror database, running on its own server, and a series of synch functions or commands would regularly push new notes, statuses, payments, and so on from each mirror back to the master database.

If one server went down (like today), a single branch would be disabled temporarily, but the others wouldn't notice.  As well, if someone ran a (local) mass update function, it would only affect the 12-15 people in their local branch, and eliminate wide area network/internet slowdowns.

To make this work, I'm thinking these are my hurdles -- can anyone make suggestions?

1)      To synch or coordinate new records, I need an ironclad single set of global values that tracks a single series of reference IDs for new notes, payments, consumer records, and so on.  We currently add about 25 note records a minute at peak times, so it needs to access the global values, increment them, and never duplicate numbers, even when there are near-simultaneous requests for a new value.  Should I use a text file that is overwritten as new notes are created?  Should I use @alternateserver to go to the 'master' database (which may cause slowdowns when everyone wants a new number at once)?  Should I set up a tiny database for @alternateserver to use that only stores global values?  Or some other option?
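As a sketch of one more option I've been toying with (my own invention, so the global name gNoteCounter and the branch codes are placeholders): instead of sharing one counter, each branch could mint IDs that are unique by construction -- a fixed branch prefix plus a purely local counter -- so near-simultaneous requests at different offices can never collide, and no office ever waits on another's server:

```
var vBranchCode as String
var vNewNoteNo as Int
var vNoteID as String

// Each office gets a fixed, unique code: "A", "B", "C", ...
vBranchCode = "B"

// Increment this branch's own counter; no other office touches it
vNewNoteNo = @ToNumber(@GlobalValue("gNoteCounter")) + 1
GlobalValue("gNoteCounter", vNewNoteNo)

// Prefixed ID, e.g. "B100451" -- unique across all branches
vNoteID = vBranchCode + @Str(vNewNoteNo)
```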

2)      Copying from database B or database C to database A is fairly simple, I think -- I would create a 'synch' button that uses @alternateserver and xlookup to check whether each record already exists in the master database A -- if not, use @alternateserver and xresultset to create it.  Agreed?
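Something like this per record, as a rough sketch -- vMasterRef is a placeholder for however @alternateserver addresses database A, and the "Notes!Notes" form, its fields, and the DBNoteID/DBNoteText bindings are all invented for the example:

```
var vMasterRef as String
var vRSHandle as Int
var vExists as String

vMasterRef = ""		// placeholder: the @alternateserver reference to master database A

// Does this note already exist in the master?
vExists = @XLookup(vMasterRef, DBNoteID, "Notes!NoteID", "NoteID")
If vExists = ""
{
	// Grab an empty result set in the master and create the record there
	vRSHandle = @XResultSetSearch(vMasterRef, "Notes!Notes", SEARCH_MODE_AND, SEARCH_SYNTAX_QA, "NoteID==")
	If vRSHandle > -1
	{
		XResultSetCreateNewRecord(vRSHandle)
		XResultSetValue(vRSHandle, "NoteID", DBNoteID)
		XResultSetValue(vRSHandle, "NoteText", DBNoteText)
		XResultSetClose(vRSHandle)
	}
}
```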

3)      If I can build an on-the-fly synch function into a command button or what have you, that's great for someone refreshing the master database before running a critical report -- but can I also create a scheduled synch for, say, 2am?
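If the synch routine can be wrapped in something launchable from the command line (the script path below is a placeholder -- I don't know yet what would actually trigger the synch inside Sesame), the scheduling side on Linux is just a cron entry:

```
# Branch server crontab: run the synch every day at 2:00am
0 2 * * * /usr/local/bin/sesame_synch.sh >> /var/log/sesame_synch.log 2>&1
```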

4)      Rather than synch back from database A to databases B and C, it would be better to set up a simple method after the synch to copy database A back over databases B and C -- is there a non-manual way I could force a copy down to the branch offices, forcing their local Sesame clients to shut down first, or un-loading the file in their server app?  Am I making this more complicated than it should be?
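Roughly what I'm imagining for the push-down, sketched as a dry run -- hostnames, paths, and the init script name are all placeholders, and it only prints the commands it would run, so nothing fires until the echos are removed:

```shell
#!/bin/sh
# Dry-run sketch: push the master application file down to one branch.
# BRANCH and APP are placeholders; /etc/init.d/sesame stands in for
# whatever start/stop mechanism the branch's Sesame server uses.
BRANCH="${1:-branch-b.example.com}"
APP="${2:-/opt/sesame/data/crm.db}"

echo "ssh $BRANCH /etc/init.d/sesame stop"      # unload the branch copy
echo "scp $APP $BRANCH:$APP"                    # overwrite with the master file
echo "ssh $BRANCH /etc/init.d/sesame start"     # bring the branch back up
```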

I realize these ideas and questions are rough as I 'think out loud', but I think this is the best way to keep our company growing -- at our current rate, we'll have 5 branch offices by the end of 2016, and 8 by the end of 2017.  If I build this properly now, it won't matter if we have 3 million consumer records and 8 million related note records, the only thing that will affect speed is the number of local users' activities in the branch.

Thoughts?  Comments?

Thanks!
Blair
  
 
The Cow
YaBB Administrator
*****
Offline



Posts: 2530
Joined: Nov 22nd, 2002
Re: Multiple Applications - Shared Globals & Synching
Reply #1 - Mar 26th, 2015 at 1:20pm
A small question about your big question: why do you have a "threading limit" on Sesame server? Why not let it use as many cores as are available?
  

Mark Lasersohn | Programmer | Lantica Software, LLC
 
BWETTLAUFER
Full Member
***
Offline



Posts: 216
Location: Cambridge, Ontario
Joined: Apr 9th, 2010
Re: Multiple Applications - Shared Globals & Synching
Reply #2 - Mar 27th, 2015 at 2:50am
Sir ... that's an awesome question, and if we could fix that, my server performance would increase by 400% and I'd declare you King of the World ... I've been telling Ray my woes, and I can PM you our emails ...
  
 
Ray the Reaper
Global Moderator
Members
Lantica Support
*****
Offline


The One & The Only

Posts: 2480
Joined: Aug 20th, 2003
Re: Multiple Applications - Shared Globals & Synching
Reply #3 - Mar 27th, 2015 at 1:42pm
Hi Blair,

There is no threading limit built into Sesame.  Being 32 bit, we are limited in the amount of RAM we can access, which in turn limits the number of threads, but not the number of cores.  I see that on 32 bit Windows, a 32 bit program is limited to 2024 threads; on 64 bit Windows, a 32 bit program is limited to 2924-3203 threads depending on the OS.  I doubt you are hitting those limits.

-Ray
  

Raymond Yoxall Consulting
ray.yoxall@gmail.com
ryoxall@lantica.com
Sesame Applications, Design and Support
 
The Cow
YaBB Administrator
*****
Offline



Posts: 2530
Joined: Nov 22nd, 2002
Re: Multiple Applications - Shared Globals & Synching
Reply #4 - Mar 27th, 2015 at 2:00pm
Blair,
As Ray is pointing out, I think your limitation is more likely in available RAM, not number of CPU cores.  A 32 bit program is limited to 4 gig of addressable memory.  On a 64 bit box with Linux, 16 gig of RAM, and a 32 bit process, Sesame will have that entire 4 gig available, in that each 32 bit program gets its own virtual address space.  But your application has already crossed the 2 gig limit Windows imposes (3 gig if you set the large address aware flag), and the closer you get to the limit, the more likely it is that you will begin swapping (using the hard drive as if it were memory), which will really slow you down.

If you disagree and can see that Sesame is being limited to 2 cores: on Linux there are settings that can lock a particular program onto particular cores.  If Sesame is only using 2 cores no matter how busy the server is, you will need to investigate whether one of those settings has been erroneously set.
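For example, a quick way to check from a shell whether an affinity or thread cap is in play (a sketch -- pass the Sesame server's PID as the argument; it falls back to the current shell's PID so the commands can be tried anywhere):

```shell
#!/bin/sh
# Inspect core affinity and thread limits for one process on Linux.
PID="${1:-$$}"

# Affinity mask; "ff" means all 8 cores are allowed (skipped if taskset absent)
command -v taskset >/dev/null && taskset -p "$PID"

grep -i "max processes" "/proc/$PID/limits"   # kernel's per-process cap
ps -o nlwp= -p "$PID"                         # threads in use right now
ulimit -u                                     # per-user process cap for this login
```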

As to your big question, I would need to know quite a bit more about how your application is used to better advise you on how to break it up for performance.  In most cases, the best way is to retire old, little-used records -- essentially keeping a set of archive applications containing the records that rarely get accessed any more.  But because I don't really know how your business accesses data, I can't say whether that will work for you.
  

Mark Lasersohn | Programmer | Lantica Software, LLC
 
BWETTLAUFER
Full Member
***
Offline



Posts: 216
Location: Cambridge, Ontario
Joined: Apr 9th, 2010
Re: Multiple Applications - Shared Globals & Synching
Reply #5 - Mar 30th, 2015 at 12:31pm
Here is a screen capture of our server system monitor while I'm running a mass update that bogs everyone down ... you can see it's only really using one CPU, and there is tons of memory left.

In Ubuntu Server v.11, where would I check to see if there is a thread cap on my application?
  

SENYWF_C.PNG ( 196 KB | 52 Downloads )
 
The Cow
YaBB Administrator
*****
Offline



Posts: 2530
Joined: Nov 22nd, 2002
Re: Multiple Applications - Shared Globals & Synching
Reply #6 - Mar 30th, 2015 at 2:27pm
Print Post Print Post  
Blair,
That doesn't show only one CPU being used, and the amount of memory in use may be problematic.  A total for the whole system is not as useful as how much the Sesame server process itself is using.  On one of the other tabs you can show stats for individual processes.

First, CPUs: your graph shows eight CPUs in use, two somewhat heavily.  This is probably due to the mass update hogging the various locks that protect data from multi-threaded overwrites.  Basically, because a mass update can issue commands much faster than a human can, it grabs resources in the Sesame server, locking out the humans briefly but very frequently.

When you are not running a mass update but do have a lot of active users working, do you see a more even distribution of CPU usage?  If so, then there are no system limits on any process's use of CPUs.  Be aware that any chart of CPU usage will show minimal usage for interactive activity.  Only tight loops with lots of math will actually drive a CPU into high numbers and keep it there; any I/O will cause the CPU numbers to drop.  The number/graph shown in the chart is a running average.  So if you do some math and then quickly write the result to disk, the CPU is likely to show a medium average.  If you do a lot of math and no I/O, it will show a relatively high average.  If you do a lot of I/O and only a little math, the average will drop to nearly zero.

Can you describe what the mass update is doing and its role in daily activity? It may be possible to schedule it for a less busy time, or even change the MU so that it has less impact on various "locks" in Sesame.

As to your original question, I've been giving it some thought.  There are a couple of ways to work this, and choosing between them depends on knowing more about what is going on.  I have a small utility that no one has seen yet that allows you to synch applications based on criteria.  That might be useful to you if you decide to run multiple servers offering up the same application.  I am also looking into creating what I've been calling a "data data" server.  This would split the raw data out of the main Sesame server into one or more separate 32 bit processes, making your memory usage more manageable.  The most labor intensive solution I've been looking into is creating a 64 bit version of Sesame 2 -- I already have a 64 bit version of Sesame 3 -- which would let you use your entire 64 bit address space.

  

Mark Lasersohn | Programmer | Lantica Software, LLC
 
BWETTLAUFER
Full Member
***
Offline



Posts: 216
Location: Cambridge, Ontario
Joined: Apr 9th, 2010
Re: Multiple Applications - Shared Globals & Synching
Reply #7 - Mar 30th, 2015 at 3:17pm
Hey sir,

A lot of what we do is based on Mass Updates ... we are running some of the heavier ones after hours, but there are only so many hours in the day.  Here's an example of a command button that assigns files to a staff member, something we do 3-10 times a day:

Code

var vRSHandle as Int
var vPromptOn as String		// declared but unused in this version
var vPromptUser as String
var vPromptAmt as Int
var vPromptLimit as Int		// compared and assigned numerically, so Int rather than String
var q as Int
var vNote as String
var vCount as Int
var vNewNoteNo as Int
var vNewFileID as Int
var vCltList as String
var vEmailList as String
var vHeader as String
var vMessage as String
var b as String

If @Mode() = 1 And @ResultSetTotal() > 0
{
	vPromptLimit = @ToNumber(@XLookup(@FN, @UserID, "Staff Screen!StaffRef", "StaffTransferLimit"))
	If vPromptLimit < 1
	{
		@MsgBox("You Do Not Have Authorization", "To Move Files In Batch", "")
	}

	If vPromptLimit > 0
	{
		vPromptAmt = @ResultSetTotal()
		vPromptAmt = @Min(vPromptLimit, vPromptAmt)
		vPromptUser = @PromptForUserInput("Who would you like to assign files to?", @UserID)
		vPromptAmt = @ToNumber(@PromptForUserInput("How many files would you like to assign?", vPromptAmt))

		// HDS 2014/11/07: Optimized conditional
		If vPromptAmt > vPromptLimit
		{
			@MsgBox("You Do Not Have Authorization", "For That Number of Files", "Reducing To Limit")
			vPromptAmt = vPromptLimit
		}

		For q = 1 To vPromptAmt
			ResultSetCurrentPosition(q)
			DBColl# = vPromptUser

			vCltList = @AppendStringArray(vCltList, DBCltNo)

			vNote = "UPD USER -- " + vPromptUser
			gAddNote(vNote)
		Next

/*
		// ===================================================
		// Create Log showing Files Assigned -- 2013-06-15 BW
		// ===================================================

		// Get an empty result set
		vRSHandle = @XResultSetSearch(@FN, "Tasks!Tasks", SEARCH_MODE_AND, SEARCH_SYNTAX_QA, "TaskID==")

		If vRSHandle > -1
		{
			XResultSetCreateNewRecord(vRSHandle)
			XResultSetValue(vRSHandle, "TaskTitle", "New Business Assigned -- " + vPromptUser + " - " + vPromptAmt)
			XResultSetValue(vRSHandle, "TaskDept", "Operations")
			XResultSetValue(vRSHandle, "TaskOwner", @UserID)
			XResultSetValue(vRSHandle, "TaskDesc", vPromptAmt + " Files Assigned To " + vPromptUser)
			XResultSetValue(vRSHandle, "TaskDdln", @ServerDate())
			XResultSetValue(vRSHandle, "TaskComp", @ServerDate())
			XResultSetValue(vRSHandle, "TaskDone", 1)

			// Set TaskID Number
			gFileID = @ToNumber(@GlobalValue("gFileID")) + 1
			XResultSetValue(vRSHandle, "TaskID", gFileID)
			GlobalValue("gFileID", gFileID)		// key name kept matched to the counter being stored
			XResultSetClose(vRSHandle)
		}
*/

		// ============================================
		// Email Notice of Assignment -- 2014-06-19 BW
		// ============================================

		vCltList = @UniqueStringArray(vCltList)
		vEmailList = @AppendStringArray(vEmailList, @XLookup(@FN, @UserID, "Staff Screen!StaffRef", "StaffEmail"))
		vEmailList = @AppendStringArray(vEmailList, @XLookup(@FN, vPromptUser, "Staff Screen!StaffRef", "StaffEmail"))
		vEmailList = @AppendStringArray(vEmailList, "bwettlaufer@kingstondc.com")
		vEmailList = @UniqueStringArray(vEmailList)

		vHeader = "Files Assigned -- " + vPromptUser + " -- " + vPromptAmt + " Files"
		vMessage =
			"Collector         -- " + vPromptUser + @NewLine() +
			"Number of Files   -- " + vPromptAmt + @NewLine() +
			"Clients Assigned  -- " + vCltList + @NewLine() +
			"Assigned By       -- " + @UserID

		// SMTP password redacted
		b = @SendMail("mail.kingstondc.com", vHeader, "support@kingstondc.com", vEmailList, "", "", vMessage, "bwettlaufer@kingstondc.com", "********", "")

		ResultSetCurrentPosition(q + 1)
	}
}
Else
{
	@MsgBox("No Files Retrieved To Assign!", "", "")
}



The effect of these Mass Updates is that hitting F10 (which on form entry runs an Xlookup), or re-statusing an account (which also runs an Xlookup), slows from instantaneous to 3-10 seconds.  Of course, it's a bit of a pileup when multiple people are all entering notes or running Xlookup functions in the background, slowing everything down to 5-20 seconds at worst.

Synching applications would be stellar!  I could have a Master Database, and then synch my branch databases, with a frequency depending on what kind of time/processing drag it would have when run.

If you want to use me as a guinea pig/beta tester, I stand ready to assist. :)
  
 
The Cow
YaBB Administrator
*****
Offline



Posts: 2530
Joined: Nov 22nd, 2002
Re: Multiple Applications - Shared Globals & Synching
Reply #8 - Mar 31st, 2015 at 2:59pm
Ray is looking at the synch utility right now.

In the meantime, I'd like a chance to see why the mass update is locking things down so thoroughly. Please email me instructions for running the mass update and a set of instructions for typical user actions that are impeded when the MU is running.
  

Mark Lasersohn | Programmer | Lantica Software, LLC
 
BWETTLAUFER
Full Member
***
Offline



Posts: 216
Location: Cambridge, Ontario
Joined: Apr 9th, 2010
Re: Multiple Applications - Shared Globals & Synching
Reply #9 - Mar 31st, 2015 at 6:09pm
PM Sent!
  