Personal Blog of Thomas Hampel - Creative Mythbusting in Development and Collaboration


Create a replica without having direct server access

Thomas Hampel, 5 July 2013

Here's the problem:
You want to create a new replica of an existing database on a server you are responsible for, but you are not allowed to access the remote server that hosts the source database.
Not having access means, for example, that your user ID is in an access deny group, or, in a simpler scenario, that a firewall is blocking direct access.

So how would you pull a new replica from the remote server down to yours?
The answer is simple - you can set up a replica stub on your server without needing to access the remote server.

Step by step instructions

1. Switch to your workspace and make sure no database is selected.
2. Use File\Replication\New Replica.
3. Type the server name and file name from which you want to pull the replica.

Image:Create a replica without having direct server access
4. Click "Select".
Now your client will try to connect to the remote server, which of course won't work.

Image:Create a replica without having direct server access
5. A dialog box appears, showing an incomplete question.

Image:Create a replica without having direct server access
Here you have to select "Yes" without knowing what the question actually means.
Note: Obviously that's a bug, but it seems it has not been fixed yet.
6. Choose the server on which you want to place the replica and define a file name of your choice.
7. Disable "Create Immediately".

Image:Create a replica without having direct server access
8. Hit OK to create an uninitialized replica stub.
9. The last and final step is to replicate this database at console level using the command:

    >pull remoteserver/ou/o localpath/filename.nsf

A note for beginners:
Your server must also be allowed to read from the remote server, and the target server needs to know how to reach the source server - so make sure you have proper name resolution or connection documents in place.
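
As a minimal illustration (server, organization, and file names below are made up), verifying connectivity first and then pulling the stub from your server's console could look like this:

    > trace RemoteHub/Sales/Acme
    > pull RemoteHub/Sales/Acme mail\jdoe.nsf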

Achieving (a working) high availability with IBM Lotus iNotes

Thomas Hampel, 2 July 2013

Update: For configuring high availability for HCL Verse, please refer to this technote: Configuring a Proxy for HCL Verse High Availability

We all like well-working products and love good documentation - even better when there is a step-by-step instruction on how to set up a specific configuration so that it works perfectly.
One of those often-referenced instructions is the IBM developerWorks article "Achieving high availability with IBM Lotus iNotes", based on F5's BIG-IP product, which explains a clever reverse proxy configuration for optimizing performance.

Unfortunately, the configuration outlined there DOES NOT WORK because it contains multiple errors.

Following the instructions step by step will not get the expected solution in place. Let me explain the problem in more detail.


In a small environment with only two servers in one cluster you won't notice any problem; everything seems to work perfectly.
What you don't know is that the iRule does not work, and traffic is always dispatched to both of your servers. As soon as multiple clusters are involved, the problem becomes visible.


From time to time users receive "Error 404 - HTTP Web Server: Lotus Notes Exception - File does not exist", which indicates that traffic was routed to a server that doesn't host the requested file.


The (not working) documentation has been published in at least two other places, a Domino wiki article and a white paper:

http://www-10.lotus.com/ldd/dominowiki.nsf/dx/Achieving_high_availability_with_IBM_Lotus_iNotes
http://www.f5.com/pdf/deployment-guides/f5-ibm-inotes-dg.pdf

Let's get back to the roots - according to the developerWorks article, this is what (in theory) should happen:

  • The BIG-IP F5 reverse proxy appliance intercepts inbound HTTP requests which end with ".nsf" and are not directed to "names.nsf".
  • Domino figures out which servers host the requested file and returns a list of server DNS names in the form of an HTTP header, for example the one sketched below.
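
For illustration only (the host names are hypothetical), such a header is simply a comma-separated list of DNS names of the servers holding a replica:

    X-Domino-ClusterServers: mail1.acme.com,mail2.acme.com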


The problems are:
  • BigIP will send traffic to any server in the configured server pool - so your session can end up on a random cluster/server which may not host the database you are looking for.
  • Domino lookups are performed against the local "cldbdir.nsf", which only holds information about databases in this cluster. What if there are multiple clusters involved?
  • According to the documentation: "X-Domino-ReplicaServers is returned when the service finds the relevant path within its own cluster, whereas X-Domino-ClusterServers is returned only when the mail servers are part of a different cluster." But the iRule itself only refers to "X-Domino-ClusterServers"; the other header, "X-Domino-ReplicaServers", is never used. #fail!


Let's look into the details:

In Domino, a customized ServersLookup form in "iwaredir.nsf" is used to look up "cldbdir.nsf" to figure out which servers host the file, and this information is returned as part of an HTTP header.
Sniffing network traffic with Wireshark shows that this HTTP header is never returned; it also shows that the URL referenced in the iRule is never called.

The iRule documented in Appendix B calls the (modified) ServersLookup form to retrieve the list of servers as an HTTP header:

HTTP::uri /iwaredir.nsf/ServersLookup?OpenForm&nsfpath=$nsf



Unfortunately, this part of the iRule is never executed, because it expects the request URI to end with ".nsf":


if { ([HTTP::uri] ends_with ".nsf") and not ([HTTP::uri] contains "names.nsf") } {
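
To illustrate the difference with a made-up mail file path - [HTTP::path] excludes the query string while [HTTP::uri] does not:

    GET /mail/jdoe.nsf?OpenDatabase HTTP/1.1
    [HTTP::uri]  -> /mail/jdoe.nsf?OpenDatabase   (does not end with ".nsf")
    [HTTP::path] -> /mail/jdoe.nsf                (ends with ".nsf")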



OK, let's try to fix it!

Resolving the problem requires changes on both sides: multiple changes in Domino and a slight change to the F5 iRule. I'll cover the modifications step by step.

Part 1 - The iRule

Here you need to change the if-clause to check the path rather than the full URI, and also exclude any lookups towards "iwaredir.nsf" so that the lookup request issued by the iRule itself is not processed again; compared to the original, the changes are the [HTTP::path] references and the additional "iwaredir.nsf" exclusion:


if { ([HTTP::path] ends_with ".nsf") and not ([HTTP::path] contains "iwaredir.nsf") and not ([HTTP::path] contains "names.nsf") } {
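
For context, here is a minimal sketch of how the corrected condition and the ServersLookup call quoted earlier fit together inside the HTTP_REQUEST event; the remaining logic of the original iRule from Appendix B (saving the original URI, evaluating the returned header, and selecting the pool member) is deliberately not reproduced here:

    when HTTP_REQUEST {
        # Only act on *.nsf requests, never on the lookup database itself or on names.nsf
        if { ([HTTP::path] ends_with ".nsf") and not ([HTTP::path] contains "iwaredir.nsf") and not ([HTTP::path] contains "names.nsf") } {
            # Ask the (modified) ServersLookup form which servers host this database
            set nsf [HTTP::path]
            HTTP::uri "/iwaredir.nsf/ServersLookup?OpenForm&nsfpath=$nsf"
        }
    }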



Part 2 - Database Catalog

In order to find the correct servers on the first attempt, my idea was to look up the (in our case always accurate) database catalog to find the servers hosting the requested file.

To do that we will need to create a new (hidden) view in catalog.nsf - referenced as "($LookupServerFilename)" in the lookup code below - with two columns:

View selection formula:
    SELECT @IsAvailable(ReplicaID) & @IsUnavailable(RepositoryType)

Column 1 formula (the column must be sorted so it can serve as the @DbLookup key):
    Pathname

Column 2 formula:
    ReplicaID2 := @If(@Text(ReplicaID; "*") = "00000000:00001601"; "Non-replicatable files"; ReplicaID);
    @Text(ReplicaID2; "*")

Column 2 programmatic name:
    TextReplicaID
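
As a quick sanity check (the file path and replica ID below are made up), a key lookup against this view should resolve a database path to its text replica ID - exactly what the ServersLookup code in Part 3 relies on:

    REM {hypothetical example - returns something like "C1257A4B:0042D5F0"};
    @DbLookup("":""; "":"catalog.nsf"; "($LookupServerFilename)"; "mail/jdoe.nsf"; "TextReplicaID")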







Part 3 - ServersLookup

Now let's make use of the view by updating the code in the "ServersLookup" form of the file iwaredir.nsf.

If no nsfpath parameter is provided, it is assumed that the user wants to access their mail server.
The code behind the $$HTMLHead field should look like this:



tmpDebug := "";
tmpString := "";

REM {Requested database path; if no nsfpath parameter is provided, fall back to the user's mail file};
tmpNSFPath := @ReplaceSubstring(@URLDecode("Domino"; @UrlQueryString("nsfpath")); "/"; "\\");
@If(tmpNSFPath = ""; tmpNSFPath := @Name([Canonicalize]; @NameLookup([NoUpdate]; @UserName; "MailFile")); "");

REM {Lookup home mail server};
tmpHomeServer := @Name([Canonicalize]; @NameLookup([NoUpdate]; @UserName; "MailServer"));
tmpLookupKey := @ReplaceSubstring(tmpNSFPath; "\\"; "/");

REM {Get replica ID of this mail file from the catalog view created in Part 2};
tmpReplicaID := @DbLookup("":""; "":"catalog.nsf"; "($LookupServerFilename)"; tmpLookupKey; "TextReplicaID");

REM {Find all servers which host this replica ID};
tmpServers := @DbLookup("":""; "":"catalog.nsf"; "($ReplicaID)"; tmpReplicaID; "Server");
tmpServers := @If(@IsError(tmpServers); ""; tmpServers);

REM {If the home mail server is in the list of servers, move it to the front of the list};
tmpServers := @If(@IsMember(tmpHomeServer; tmpServers); tmpHomeServer : @Transform(tmpServers; "x"; @If(x = tmpHomeServer; @Nothing; x)); tmpServers);

tmpDNSNames := "";

REM {Resolve the HTTP host name for each server name in the list; tmpString is only assembled for debugging};
tmpLimit := @Elements(tmpServers) + 1;
@For(n := 1; n < tmpLimit; n := n + 1;
tmpHTTPHostNameALT := @Subset(@DbLookup("":""; "":"names.nsf"; "($ServersLookup)"; tmpServers[n]; "HTTP_Hostname"); 1);
tmpServerFQDN := @Subset(@DbLookup("":""; "":"names.nsf"; "($ServersLookup)"; tmpServers[n]; "SMTPFullHostDomain"); 1);
tmpString := tmpString + @Text(n) + tmpHTTPHostNameALT + tmpServerFQDN;
tmpDNSNames := @If(@Length(tmpDNSNames) > 0; tmpDNSNames + ","; "") + @LowerCase(@If(tmpHTTPHostNameALT != ""; tmpHTTPHostNameALT; tmpServerFQDN))
);

REM {Return the results to the F5 as HTTP headers};
@SetHTTPHeader("X-Domino-ClusterServers"; tmpDNSNames);
@SetHTTPHeader("Cache-control"; "no-store");
@If(tmpDebug = ""; ""; "")
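
To test the form without the F5 in front of it, you can call the lookup URL directly (host and file names below are hypothetical) and inspect the response headers, for example with Wireshark or your browser's developer tools:

    http://mail1.acme.com/iwaredir.nsf/ServersLookup?OpenForm&nsfpath=mail/jdoe.nsf

Expected response header (again with hypothetical host names):

    X-Domino-ClusterServers: mail1.acme.com,mail2.acme.com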



Update:

Session persistence causes some headaches when the F5 needs to select an address from the pool. To work around this issue, you can use this iRule:

inotes-irule.txt


Result:

No more nasty HTTP 404 errors, unless the database really cannot be found anywhere.
Of course even this solution depends on a few assumptions; one is that the catalog must be up to date and must be replicating within the environment.


Disclaimer: Use at your own risk, no warranty is provided. However, please let me know if you have further suggestions on how to improve this solution.
Thomas Hampel, All rights reserved.