f# - Datamining models in FORTRAN or C (or managed code)?


We are planning to develop a data mining package for Windows. The program core / calculation engine will be written in F#, with the GUI, DB bindings and other accessories done in C# and F#.

However, we have not yet decided on the model implementation. Since we need high performance, managed code does not seem to be an option here (any objections?). The question is: is it reasonable to develop the models in FORTRAN, or should we stick to C (or maybe C++)? We are also looking at using OpenCL at some point for suitable models, and for those cases having to go from managed code -> FORTRAN -> C -> OpenCL invocation seems awkward.

Any suggestions?

F# compiles to the CLR, which has a just-in-time compiler. It is a dialect of ML and is strongly typed, which allows the sorts of optimisations that go with that type of architecture; this means you will probably get reasonable performance out of F#. For comparison, you could also try porting your code to OCaml (which, IIRC, compiles to native code) and see whether that makes a material difference.
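For instance (a purely illustrative kernel, not anything from the question), a tight numeric loop like the dot product below is the sort of code worth timing in F# before deciding managed code is too slow; it also ports almost line-for-line to OCaml if you want to compare the CLR's JIT against a native-code compiler.

    // A small, strongly typed numeric loop: the JIT can turn this into
    // straightforward native code, so it makes a reasonable micro-benchmark.
    let dot (xs: float[]) (ys: float[]) =
        let mutable acc = 0.0
        for i in 0 .. xs.Length - 1 do
            acc <- acc + xs.[i] * ys.[i]
        acc

    // Crude timing helper; run it on representative data sizes.
    let time f =
        let sw = System.Diagnostics.Stopwatch.StartNew()
        let result = f ()
        sw.Stop()
        result, sw.Elapsed

    // Example usage: let value, elapsed = time (fun () -> dot a b)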

If it really is intolerably slow, then see how far scaling up the hardware will take you. With the performance available from a modern PC or server, it seems unlikely that you will need to go exotic unless you are working with a truly enormous data set. Users with smaller data sets may well get by on ordinary PCs.

Workstations can give you an order of magnitude more capacity than a standard desktop PC. A high-end workstation from HP (similar kit is available from many other manufacturers) can take two 4- or 6-core CPU chips, tens of gigabytes of RAM (up to 192 GB in some cases) and has options for high-speed I/O such as SAS disks. This type of hardware is expensive, but it can still be cheaper than a large amount of programmer time. Your existing desktop support infrastructure should be able to cope with this kind of kit. The most likely problem is compatibility issues when running 32-bit software on a 64-bit O/S, in which case you have various options such as VMs or KVM switches to work around them.
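If you do go the multi-core workstation route, a minimal sketch of how the F# engine could use those cores is Array.Parallel from FSharp.Core (the per-record scoring function here is a hypothetical stand-in for one of your models):

    // Sketch: spread per-record model scoring across all available cores.
    let scoreAll (score: float[] -> float) (records: float[][]) =
        records |> Array.Parallel.map score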

The next step up is a 4- or 8-socket server. Fairly ordinary servers go up to 8 sockets (32-48 cores) and possibly up to 512 GB of RAM, without having to move off the Wintel platform.

Finally, if you cannot get it to run quickly enough in F#, validate the F# prototype and build a C implementation, using the F# prototype as a control. If that is still not fast enough, then you have real problems.
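As a rough sketch of that control step, the F# below calls a hypothetical native C routine (the library name "modelcore" and the function "score_model" are invented for illustration) via P/Invoke and checks it against the managed prototype:

    open System.Runtime.InteropServices

    module Native =
        // Hypothetical native library and entry point; substitute whatever
        // your C implementation actually exports.
        [<DllImport("modelcore", CallingConvention = CallingConvention.Cdecl)>]
        extern double score_model(double[] features, int length)

    // The F# prototype acts as the control implementation.
    let scorePrototype (features: float[]) =
        features |> Array.sumBy (fun x -> x * x)   // placeholder model

    // Compare the native result against the prototype on the same input.
    let agrees (features: float[]) =
        let expected = scorePrototype features
        let actual = Native.score_model(features, features.Length)
        abs (expected - actual) < 1e-9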

If your application can be structured in a way that suits the platform, you could look at a more exotic platform. Depending on what will work with your application, you might be able to host it on a cluster or with a cloud provider, or build the core engine on more specialised hardware. In doing so, however, you take on (quite substantial) additional costs and exotic dependencies that may cause support problems. You would probably also have to bring in third-party consultants who know how to program the platform.

In the end, the best advice is this: suck it and see. If you are comfortable with F#, you should be able to prototype your application quite quickly. See how fast it runs and do not worry too much about performance unless you have a clear indication that it really is an issue. Remember, Knuth said that premature optimisation is the root of all evil about 97% of the time. Keep a weather eye out for problems, and re-evaluate your strategy if you think performance really will cause trouble.

Edit: if you want to build a packaged app then you may be more performance-sensitive than otherwise; in that case performance will probably become an issue sooner than it would with a bespoke system. However, this does not affect the basic 'suck it and see' principle.


  1. For example, at the risk of starting a game of buzzword bingo, if your app can be parallelised and designed to work on a shared-nothing architecture, you could see whether a cloud server provider could be persuaded to host it. A suitable front-end could be built to run locally or through a browser. However, on this type of architecture the internet connection to the data source becomes a bottleneck: if you have a large data set, uploading it to the service provider becomes a problem, and it can be quicker to process a large dataset locally than to push it over an internet connection.
