Thursday, 28 August 2014

IBM lets scientists pay to play with its thinking supercomputer:



'via Blog this'

IBM's Watson promises to usher in a new era of "cognitive computing," but, so far, all the system has demonstrated is a knack for game shows. Now, however, IBM has announced Watson Discovery Advisor, a cloud-based service that'll enable researchers to harness those smarts to do more than put Ken Jennings out of a job. Using the platform, scientists can ask Watson natural-language questions, sending the system to scour every publicly available research paper ever written, in every available field. Digesting this information, Watson is then able to identify connections that would have taken a person a lifetime to find, which promises to accelerate the speed of scientific discovery. In one instance, the Baylor College of Medicine used Watson to crunch six years' worth of cancer protein research into "a matter of weeks." Now all we need to do is scrape together the cash to ask the supercomputer the ultimate question...

Tuesday, 26 August 2014

Open City - Events | Open City Event Listings:

Our first project at andDROIDdevs!




1 Finsbury Circus

1 Finsbury Circus, EC2M 7EB, London, England
This Lutyens Grade II listed building has been comprehensively redeveloped to provide a high quality contemporary interior, with a fully glazed …

10 Brock Street

Regent's Place Estate, NW1 3JL, London, England
A brand new build, completed in the summer of 2013, this 14-storey office building uses modern technology to operate in an environmentally …

10 Downing Street

10 Downing Street, SW1A 2AA, London, England
10 Downing Street has been the residence of British Prime Ministers since 1735. Behind its black door have been the most important decisions …

100 Victoria Embankment - 'Unilever House'

100 Victoria Embankment, EC4Y 0DY, London, England
Landmark curved Grade II listed building which has been transformed to give it a new lease of life. RIBA Award Winner 2009.

103 Copenhagen Street

103 Copenhagen Street, N1 0FL, London, England
New block of eight sustainable homes and office units, with generous social spaces, good daylight and robust and attractive materials for a …

13 Burgoyne Road

13 Burgoyne Road, N4 1AA, London, England
Remodelling and extension of a 3-storey Victorian terrace house, involving sensitive refurbishment of period features complemented by a rich and …

15 Edge Hill

15 Edge Hill, SW19 4LR, London, England
New build rear extension to Victorian house, entailing extensive remodelling of ground floor to create a large open plan flow-through living, dining …

18 Stafford Terrace - The Sambourne Family Home

18 Stafford Terrace, W8 7BH, London, England
From 1875, the home of the Punch cartoonist Edward Linley Sambourne, his wife Marion, their two children and their live-in servants. Today, the house …

2 Willow Road

2 Willow Road, NW3 1TH, London, England
Goldfinger's unique Modernist home, largely in original condition. The house, which has 3 floors at the front and 4 at the back, is designed for …

20 Triton Street - Lend Lease HQ

20 Triton Street, Regent's Place, NW1 3BF, London, England
Lend Lease's EMEA HQ is a showcase for occupant health, productivity and employee engagement. First office fit-out to be awarded BREEAM …

Geek Dreams: Cray CS-Storm Delivers High-Performance Computing In Million-Dollar Package | TechCrunch:



'via Blog this'





Imagine a system with 22 x 2u servers in a 48u rack, all cranking on 176 NVIDIA Tesla K40 GPU chips and providing an astonishing 250 teraflops per rack. We're talking scream machines, and that's what Cray is delivering in its latest high-performance system, the Cray CS-Storm.
Consider that a four-cabinet Cray CS-Storm system is capable of delivering more than one petaflop of peak performance. That's a mighty powerful system.
And it comes at a time when you can get inexpensive computing power with a credit card from infrastructure providers like AWS, or cheap machines from white-box PC makers. Cray doesn't try to compete there, though, sticking to the high-performance computing market as it always has, where the companies that use these systems need full-on, pedal-to-the-metal power 24x7, 365 days a year. And that's what Cray provides, says Barry Bolding, vice president of marketing and business development at Cray.
He points out that these machines are not for the faint of heart. To justify the cost of one of these babies, which he notes could run you close to a million dollars fully loaded, you need specialized workloads and you need to run them full bore, pretty much all day long, every day. When he says full on, he means it.
"The NVIDIA [Tesla] K40s run at 300 watts when doing maximum calculations," Bolding explained. And the CS-Storm has been engineered to run at a maximum of 300 watts all the time, he told me. That means it delivers full power all day long.
To give you a sense of where the cost comes from, a single NVIDIA Tesla K40 chip costs around $4,000. As he said, do the math: there are up to eight per node and a total capacity of 176 in one rack. If you have four fully loaded racks, that's over 700 chips. We're talking serious power for serious cash.
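
For the curious, here is that math as a quick back-of-the-envelope sketch in Python. The per-chip price, per-rack counts, 300-watt figure and 250-teraflop rating all come from the article; the rest is just arithmetic.

    gpus_per_node = 8            # up to 8 Tesla K40s per node
    nodes_per_rack = 22          # 22 x 2u servers in a 48u rack
    gpus_per_rack = gpus_per_node * nodes_per_rack    # 176
    price_per_gpu = 4_000        # approximate cost per K40, USD
    racks = 4                    # a four-cabinet CS-Storm system

    total_gpus = gpus_per_rack * racks                # 704, i.e. "over 700 chips"
    gpu_cost = total_gpus * price_per_gpu             # $2,816,000 in GPUs alone
    rack_power_kw = gpus_per_rack * 300 / 1000        # 52.8 kW of GPU draw per rack
    peak_petaflops = 250 * racks / 1000               # 1.0, "more than one petaflop"

    print(total_gpus, gpu_cost, rack_power_kw, peak_petaflops)

One rack of GPUs alone is roughly $704,000, which squares with Bolding's "close to a million dollars fully loaded" for a system.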
Bolding said the system hardware is built on the air-cooled Cray CS300, while the software includes the Cray Advanced Cluster Engine cluster-management software for resource planning and scheduling, plus the complete Cray Programming Environment for building applications on top of the Cray platform; there is also an integration layer for working with other software. What's more, customers don't have to buy the fully configured super-duper system if they don't want it. They can save money by buying just five or ten servers in whatever configuration meets their needs and budget.
Bolding says the systems have been designed for highly specialized jobs that require intense computing, such as planet-wide weather forecasting, seismic measurement for oil exploration, financial analysis or crash simulations. All of these require a steady diet of power, with thousands of simultaneous calculations that you just can't break down into a single job. And the intensity is ongoing, not just a short burst.
That's why he says the cloud just isn't cost-effective in these instances. While it's good for that short burst, a short-term capacity boost or even steady scaling, the supercomputer comes into play when you need constant, intense computing power. Bolding explained that Cray uses the highest-quality components, designed to run at maximum capacity virtually all of the time without breaking.
He claims companies that need this kind of power can justify the cost because they won't match the efficiency of Cray's engineering with other solutions. He boasts that Cray uses the highest-quality parts with the highest signal quality, so that all the components work together at the maximum capacity the hardware allows.
If you have the need, you can justify the cost and get to work. If not, you can sit back, drool and dream your geek dreams like the rest of us.
IMAGE BY COURTESY OF CRAY (IMAGE HAS BEEN MODIFIED)

Wednesday, 13 August 2014

Sean Cull - SNTT : Using Active Directory to authenticate web users:



'via Blog this'



It took me ages to find this on Sean's old blog, so I'm saving it here just in case. Thanks for the good write-up, Sean.



Sean Cull  10 March 2011 21:24:59

Introduction



This article describes how you can use Active Directory, via LDAP and Directory Assistance, to authenticate your web users. This is particularly useful in our case, where we have an XPages-based application running on a black-boxed appliance in an MS shop.
The example uses Windows Server 2008 R2 for AD and Domino 8.5.2 running on Linux. The scheme is simple enough, but I struggled to piece the bits together, so I thought a write-up would be useful.


Useful tools



I found Apache Directory Studio really useful. It allows you to explore the Active Directory LDAP feed and get a feel for its structure.



Useful debugging parameters



I found the following two parameters very useful because they let you see the structure of the names and groups in AD as they are queried by Domino. These settings are for temporary use only: they create overhead and also show users' passwords on the console in plain text (somewhat disconcerting).

Webauth_verbose_trace=1 
LDAPDEBUG=1 
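
If you'd rather not edit notes.ini by hand, these can normally be set from the Domino server console; restarting the HTTP task picks up the web-auth setting:

    set configuration Webauth_verbose_trace=1
    set configuration LDAPDEBUG=1
    tell http restart

Remember to set them back to 0 afterwards, given the plain-text passwords on the console.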


Setting up an AD test environment



This was very straightforward. I installed a 2008 R2 server as a VM and used the Server Roles Manager wizard to install Active Directory, accepting the defaults and dependencies.
I then created a new user (joe bloggs) and used that account to authenticate the LDAP feed.

Image:SNTT : Using Active Directory to authenticate web users


Exploring the LDAP Feed with Apache Directory Studio



Use File > New and then choose LDAP Connection.

Image:SNTT : Using Active Directory to authenticate web users

Image:SNTT : Using Active Directory to authenticate web users

Image:SNTT : Using Active Directory to authenticate web users

Press the Check Authentication button and all should be well.

Next you can browse the LDAP tree and see information on the users and groups.

Image:SNTT : Using Active Directory to authenticate web users
The equivalent "Notes name" as used in an ACL would be 

CN=joe bloggs/CN=Users/DC=ad/DC=focul/DC=net 


Image:SNTT : Using Active Directory to authenticate web users
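
If you would rather poke at the feed from a script than from Apache Directory Studio, here is a minimal sketch using the Python ldap3 library. The host name, test account and password are placeholders standing in for the ad/focul/net lab setup above; substitute your own AD details.

    from ldap3 import ALL, Connection, Server

    # Placeholder host and credentials for the test domain above
    server = Server("ad.focul.net", get_info=ALL)
    conn = Connection(
        server,
        user="CN=joe bloggs,CN=Users,DC=ad,DC=focul,DC=net",
        password="secret",      # the test account's password
        auto_bind=True,         # raises an exception if the bind fails
    )

    # Browse the same branch of the tree the screenshots show
    conn.search(
        "CN=Users,DC=ad,DC=focul,DC=net",
        "(objectClass=person)",
        attributes=["cn", "memberOf"],
    )
    for entry in conn.entries:
        print(entry.entry_dn)

A successful bind here is the script equivalent of the Check Authentication button in Directory Studio.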


Configuring Domino to use the Active Directory LDAP



You need to create a Directory Assistance database and then list it in the server record.
The Directory Assistance template is an advanced template called Directory Assistance (da.ntf).

The server document entry looks like this 

Image:SNTT : Using Active Directory to authenticate web users

In the Directory Assistance database, create a record as follows.

Note that Gabriella Davis and Marie Scott, on page 20 of their very useful presentation One Directory To Rule Them All, Yes, suggest encrypting the LDAP configuration document. I'm not sure how to do that just yet.


Image:SNTT : Using Active Directory to authenticate web users

Image:SNTT : Using Active Directory to authenticate web users

Note that the Suggest and Verify buttons are very useful, particularly for the Base DN for search.

Image:SNTT : Using Active Directory to authenticate web users


Testing Authentication



Start with the most basic example you can.
With a test database, set Anonymous access to No Access and Default access to Reader or higher.

Open the URL and attempt to log in, in my case as Joe Bloggs. In the console you will see something similar to this:

Image:SNTT : Using Active Directory to authenticate web users
Your authentication is working.

You can now test it with a specific name. You can see the shape of the name from the console output.

The AD name CN=joe bloggs,CN=Users,DC=ad,DC=focul,DC=net gets mapped to CN=joe bloggs/CN=Users/DC=ad/DC=focul/DC=net for use in the ACL.
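
The mapping is just the comma-separated AD components rejoined with slashes. A tiny Python sketch of the transformation (naive on purpose: it does not handle DNs containing escaped commas):

    def ad_dn_to_notes_name(dn):
        # Swap the LDAP separator (,) for the Notes one (/)
        return "/".join(part.strip() for part in dn.split(","))

    print(ad_dn_to_notes_name("CN=joe bloggs,CN=Users,DC=ad,DC=focul,DC=net"))
    # CN=joe bloggs/CN=Users/DC=ad/DC=focul/DC=net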
Groups also work, but note that if you put a group into AD as a peer of "Users", the group name construct includes "Builtin", as in CN=testgroup/CN=Builtin/DC=ad/DC=focul/DC=net, so it is better to put the groups within the Users branch.

Image:SNTT : Using Active Directory to authenticate web users

In our case the group name is CN=testgroup4/CN=Users/DC=ad/DC=focul/DC=net 

Image:SNTT : Using Active Directory to authenticate web users



Further Integration



This OpenNTF Active Directory name picker project and search by Rishi Sahi looks really interesting. He also has some good blog articles on LDAP integration.


Other useful presentations



As mentioned above, I found Gabriella Davis and Marie Scott's presentation very useful: One Directory To Rule Them All, Yes.

I also attended Warren Elsmore's Directory Integration session at ILUG, which was very useful. You can download all of the ILUG slides here: http://www.ilug2010.org/ilug/ilug2010.nsf



A mild rant



In pulling this material together I have come to the conclusion that it is a real shame that IBM has not published the slide decks from Lotusphere 2011.

It would be a lot easier for developers to make IBM products more popular if IBM as an organisation were a good citizen of the community in that respect.

I have huge admiration for the many individuals within IBM who do their best despite IBM in this regard. I also think it is unfair to expect the community to contribute to the IBM wikis when IBM is sitting on hundreds of excellent presentations by the world experts in this area, experts who gave up thousands of hours to prepare those slide decks.

It's hardly what I would describe as a good example of a Social Business.





 Admin Tips  Appliance  Dev Tips  Show-n-Tell Thursday  Active Directory  LDAP  Lotus 




1. Marie Scott  10.03.2011 23:59:30  Directory Assistance Database
Sean - to encrypt the Directory Assistance database you would go to Database properties and select Encryption Settings to locally encrypt the database, so that anyone who might be able to physically access a copy of the database would not be able to read the LDAP password credentials. Additionally, you should enable SSL for the connection to Active Directory, but that does mean you have to have a secure LDAP port open on the AD side.

2. Sean Cull  11.03.2011 0:13:20  Thanks Marie
Thanks Marie, I thought it was encryption of just that document. Encrypting the whole database makes sense.
Thanks for the help, Sean

3. Alberto  12.03.2011 8:22:19  Other scenarios
Two more scenarios I've tested. They are relevant when you share the same users in Domino and AD.
1. Try Tivoli Directory Integrator to synchronize users. There are a couple of good papers about that.
2. Try the WebSphere plugin in IIS for web single sign-on. Tip: you'll need to duplicate names in Domino to establish the DN equivalence.

4. Sean Cull  12.03.2011 8:29:46  Thanks Alberto
Thanks Alberto - you are correct.
I looked at these but quickly discounted them because in this use case the potential customer needs something very simple, as they will have no Domino skills at all. TDI is reported to have a steep learning curve, and using IIS is complex to set up.
I was quite pleased to find that the LDAP / DA method above was so straightforward once you understood the nomenclature of the names and groups.



5. Nick Wall  12.09.2013 10:55:41  LDAP
Just done a proof-of-concept integration of AD on one of our test servers; it works great. Thanks for the reference to Apache Directory Studio, nice tool. If anyone is following the install instructions here: directory.apache.org/studio/users-guide/ldap_browser/gettingstarted_download_install.html and you are as slow on the uptake as me(!), the bit about adding a new remote site gives the URL directory.apache.org/studio/update/1.x, which no longer exists, so you will get a 404. Instead go here: { Link }, click the link for one of the versions, e.g. { Link }, and use that as the URL for the remote site.
Thanks again for "rounding up" this LDAP info, saved me a bunch of time.

Tuesday, 12 August 2014

Lialis Notes to SharePoint migration - part 2:



'via Blog this'



Notes to SharePoint: Migrating applications away from Notes – part 2

Notes to SharePoint migration part 2, including a comparison of Notes and SharePoint, the tools we use and the people we have.
 
Six months ago I wrote the first blog post about this project I am working on: moving Notes and Domino applications to SharePoint (SP). We are now (August 2014) developing the SP applications, so I thought it a good moment to write down how this project is going. The client started with approximately 200 identified Notes applications; in the end many were archived, many are planned to be moved to SAP, and 60 Notes applications will move to SP. Don't be misled by the number: some applications are built from multiple Notes databases.
Let me first start with what I think of SP these days, compared with what Domino can do for clients. Please note, this is not a complete comparison of the two products.
  • SP has some great features. It offers a lot of out-of-the-box functionality, whereas a Domino server offers almost none at the application level besides a few largely useless Notes database templates.
  • The SP user interface (browser) is great: when a new app is deployed in SP, users will find the link somewhere in SP when they log in. Compared with the Notes client, Notes database links have always been a nightmare; users have to manage the workspace themselves. I understand now why IBM tried to get rid of the Notes client desktop and replaced it with the bookmark bar, which was so bad that people were forced to find the hidden workspace again.
  • All SP apps will look the same because of the style sheet used in the browser. I find this great, and it will increase the professional look of the app environment. Notes apps, by comparison, all look different, and most Notes apps I have seen look very bad from a UI design point of view, simply because Notes developers are not UI developers. This is where SP does a good job as well: what you can do when designing an app is streamlined and bounded by SP. What I mean is that all views look the same and all forms look the same; when you stay in the out-of-the-box area, SP will help you make many apps with a consistent UI, so your users will understand them more quickly because they look and work the same.
  • You know I have been doing Domino for many years; I know this product very well. There must be an area where it's better than SP. No worries, there is. But will this be a reason for clients to choose Domino in favor of SP? I don't think so. Domino is better at developing complex Notes and web applications (with the help of XPages). In SP, when you want to do complex stuff, you are forced to step out of the safe out-of-the-box area. Sure, you can still do complex stuff in SP, but the development time becomes huge because SP is not meant for building complex applications. On the Domino server you can build anything: it supports the very powerful formula language, LotusScript and JavaScript, and you can even run Java servlets on Domino. With XPages, great web and Notes client apps can be built. It's nice to have the mail server on the same system as well, and Domino is very secure. But why do so many clients choose SP instead of Domino? Simply because they don't need to build many complex custom applications. When I look at the 8,000-user client I am currently migrating away from Notes to SP, there is only one Notes application so complex that we have no clue how to build it in SP. Only one Notes app! That is not a reason to keep Notes running, set against all the other great features SP offers. I guess the client will eventually try to buy something off the shelf to replace this XPages application. Writing this, I realize that these days clients can buy so many applications on the internet that there is no need to maintain Domino and the developers needed to build complex apps.
Anyway, how is the migration project going, six months after the start? I have to say it's going great. We are now actually building SP apps to replace the Notes apps. We started with the simple ones and will move up to the more complex ones next year. The development time we calculated is huge, so I will have something to do in 2015.
There are three items in this project that I have found to be very important, must-haves for making a project like this a success. I will share them with you so you can benefit from them.
  • Dell Quest NMSP. We use this product to migrate Notes apps to SP. First, NMSP makes an inventory of the Notes apps you have. I use this to see which forms are used in the Notes apps I must migrate to SP; the unused forms I omit. Then NMSP creates the SP lists; it moves the Notes app's ACL, author and reader fields and other security settings to the SP list; it creates the columns in SP (= the Notes fields); and finally it migrates the Notes content to SP, trying to match all the Notes names it finds with AD user names. Each app we migrate therefore starts with creating an NMSP job. When the jobs are finished, a big part of the Notes application is present in SP. Then the final SP design work must be done.
  • This leads me to the roles of the people you need. In our project the Notes developer does the NMSP part. This is logical, because he understands the Notes apps. The Quest NMSP tool has many features, so if you find a Notes consultant who understands NMSP very well you are lucky; otherwise he or she has to learn the tool, which will take a few months of on-the-job learning. On YouTube you will find seven NMSP training sessions, which are great. Back to the people in this project: of course you need an SP developer as well. In my project I do the NMSP work because I am the Notes guy. I create the jobs in NMSP and I do some easy development work in SP. Then I tell the SP developer which functions we need in the new SP app. I can do this because I understand how the Notes app works. When all the SP work has been done, I do the final tests, and when the application is 100% it is moved to the client for user acceptance testing and then to production. In our project the Notes guy is critical because he does the NMSP work, he instructs the SP development team and he is responsible for testing the end result. Of course the SP consultant has to do a proper job as well.
  • Dell Quest NMSP can migrate the Notes form design to InfoPath, but InfoPath is crap, so we replaced it with a tool we found on the market: spform.com. It's a Russian tool, it's cheap and we love it. It has made our lives much easier because it speeds up form design work: it can do tabbed tables (which we find a lot on Notes forms), it can make some fields non-editable while others remain editable, it's a drag-and-drop tool in the browser, and there is no need to waste time making pixel-perfect forms (it does that for you). In this project we have to migrate 170 forms to SP, so at this moment my company Lialis is building a tool that can convert Notes forms to spform.com forms. This will save a lot of time, because the Notes tables on the form will be migrated to spform.com and all the Notes fields will land in the right spot in the spform.com tool.
I think in the next blog I will focus more on how the organization succeeds or struggles to implement all the SP apps we develop.
If you need help with migrations like this feel free to contact me at marten@lialis.com
Lialis Migrating Notes applications from Notes to SharePoint:



'via Blog this'



A good article by Marten



Notes to SharePoint: Migrating applications away from Notes – part 1

A real-life large enterprise decided to move away from Notes/Domino. I am part of the Notes to SharePoint migration team, with a focus on the Notes application migration. The client has already carried out the mail migration to Exchange (which is easy with the right tools). I don't know why they decided to move away from Notes. I know IBM renamed Notes to Domino; I don't care, and I stick to using the Notes name when mentioning Notes client and browser apps running on a Domino server.
In this blog post I will share the work we have done so far and other experiences that might be important for you. I expect to write more posts about this project in the period from 2014 to 2016.
My background is 100% Notes; before I started this project I had never touched SharePoint or installed Active Directory.
We are now three months into this project, and so far we have worked on the following tasks.
  1. Notes application inventory
  2. Quest Notes Migrator For SharePoint evaluation and test
  3. Notes app to SharePoint conversion documentation

Phase 1; Create proper inventory of your Notes applications

A. Get rid of unused Notes applications
You will see that many Notes applications are no longer used and do not need to be migrated, so the first task is to get rid of the unused applications. The knowledge of the client's Notes administrators and developers is needed in this phase. In addition, we used Quest NMSP to scan all Notes databases on the different Domino servers. This scan gives great insight into application usage. This task takes a lot of time, because you have to talk to each application owner and establish whether the application may be removed.
B. Collect information about the Notes applications still in use
In this task you must collect the information about each Notes application that you will need during the migration. We collected details such as: application name, the Notes databases used within the application, owner, size, a short description of the application's function, usage, and some other organization-related details.
In addition, we tried to group Notes applications with similar functions together. The goal is to create one new (SharePoint) application for each group of similar Notes apps. You don't want to end up with five document libraries in SharePoint that are almost identical from a functional point of view.
We categorized the Notes application complexity as low, medium or high, and created rules for this job. The reason for this categorization is that the client wanted the insight; in addition, it's wise to start with the simple Notes apps.
The most important job of this phase is deciding where to move each Notes application. You can migrate to SharePoint, the intranet, MS Dynamics, SAP or other platforms within the organization. You can even decide to host the application externally or buy something off the shelf. The client created criteria to make this decision.
Collecting all this information takes a lot of time, because you must wait for application owners and architects to respond. I would suggest starting this phase as early as possible, because most clients can manage it themselves without the help of external (technical) consultants.

Phase 2; Install the proper tools to migrate the Notes applications away from Notes

In this case, for this particular client, we start with the migration of Notes applications to SharePoint. Approximately 40% of the Notes apps will go to SP; the rest will go to other platforms. So we start with the tools we need for the migration to SP. Later in the project we might need other tools to migrate Notes apps to other platforms (like MS Dynamics), which is something to worry about later on.
For migrating Notes apps to SP we have chosen Quest NMSP. The reason is that Quest is capable of migrating Notes content, migrating Notes security (ACL, roles, reader and author fields), includes Notes doc-link tracking support, and can migrate forms and views. As it turns out, though, we will not use Quest to migrate forms and views, because recreating these from scratch in SP seems to be faster than fixing the Quest mess in the generated forms and views (no offense to Quest). I must say we did not do a proper assessment of the other tools on the market.
In this client's case, getting an SP 2013 test environment is taking some time, so we decided to create an SP 2013 environment from scratch in our own infrastructure, for testing Quest and for a development POC. We needed this environment because Quest requires you to install software on the SP server for the link tracking and to speed up the content conversion. Randy of Dell Quest has written a great blog post on how to set up all the servers needed for an SP 2013 environment and how to install the Quest tools on them. As the Notes guy, I followed this guide and succeeded in getting these servers up and running and the Quest NMSP services installed and configured. Here is the link to Randy's blog.
The great advantage of having these servers is that you understand what it takes to install the Quest tools on an SP 2013 server, so that you can explain it to the client. You can use this environment to give the first demos to the client, and you can use it to evaluate the Quest tooling. So far I like what I have seen from Quest.

Phase 3; The phase I am in currently

We (the client and we) have made a very good inventory of the Notes applications genuinely in use that must go to SP (and to the other platforms used by the client). We have grouped similar Notes apps together, from a functional point of view, with the goal of creating one SP app per group. We have an SP 2013 server environment with the Quest tooling operational, and we can demo it to the client.

Now we need to break down, for each Notes application, the functionality within that application. We must write down details of the content in the Notes application and other details such as security. So I started writing a Word document for each Notes application, capturing the functions in that document. By capturing I mean creating screen captures of all Notes forms, views, actions and so on. The goal of this approach is to create a mid-level document explaining the app's functions. We do not dive too deeply into the application script libraries that, for example, were created to implement a Notes workflow, because that code will not be reused; the workflow will be created in a different manner in SP. I hope you understand my approach.

So, bottom line: I write a document with many screen captures which quickly show the application's functions. Then the client reviews the document to check that I did not miss any functions. Once the client approves it, it's time to enter the SharePoint details. Imagine a table with two columns: the left column drills down into the functions of the Notes application (without going to the coding and field level), and the right column explains to the application owner how the Notes functions (screens, forms, workflows and so on) will be recreated in SharePoint. A sketch of such a table follows below. The finished document with Notes and SP functions is then sent back to the application owner for approval. When approved, the development of this application can start in SP.
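
To make that concrete, here is a made-up miniature of the two-column breakdown; the rows are illustrative only, not from a real application:

    Notes function (left column)            | SharePoint equivalent (right column)
    Main view: all requests by status       | SP list view grouped by a Status column
    Request form with tabbed table          | spform.com form with equivalent tabs
    Submit action that mails the approver   | SP workflow that sends the approval mail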
At this moment we are starting the development of the first SP apps to replace the Notes apps. So far I can say that SP has far more default features and functions included out of the box than Notes, so (power) users can use the browser to modify views, add columns and do some basic form design with InfoPath. With Notes apps you need designer rights and a Notes Designer client to do similar work, which is too difficult for power users, so you need a Notes developer. On the other hand, creating both simple and complex applications seems easier and faster in Notes than in SP. The main reason is that in Notes you have the Notes Designer client, a very powerful development tool which can do all the development work, whereas the tools you need in SP (InfoPath, SharePoint Designer and C#) are slow to work with and not designer-friendly. InfoPath can be used to create SP forms, but when you create a form you can't snap the fields to a grid, so it's a nightmare to create pixel-perfect forms compared with Notes form creation. Anyway, I will keep you posted about the things we learn while moving 60 Notes applications to SP and another 100 Notes applications to other platforms.
It won’t be a walk in the park – that’s for sure.
Thank you.