Zeiss Optoplex NGQ files may have a row with explicit wavelength values – or not.
If wavelength information is present, there is an empty line before and after the row with wavelength values. We rely on the assumption that the given wavelength values are equidistant.
In the case of missing wavelength information we count the number of spectral points. The first point is assumed to belong to a wavelength of 380 nm, and the difference between neighboring points is always 5 nm.
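The two cases above can be sketched as follows. This is an illustrative reconstruction of the wavelength axis, not the actual import code; the function name and parameters are assumptions.

```python
# Sketch: reconstruct a wavelength axis for an NGQ file.
# If an explicit wavelength row exists, use its first and last values and
# assume equidistant spacing; otherwise fall back to the default grid
# starting at 380 nm with 5 nm steps.

def wavelength_axis(n_points, first=380.0, step=5.0, explicit=None):
    if explicit is not None:
        first = explicit[0]
        step = (explicit[-1] - explicit[0]) / (len(explicit) - 1)
    return [first + i * step for i in range(n_points)]
```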
Our software can now import NGQ files both with and without a wavelength row.
Having investigated a case where CODE stopped working after importing an older method, we found that a data field contained numbers marked as NaN (not a number). So far we do not know how this situation can arise – probably it was caused by a failed import of measured data.
The following mechanisms have been implemented to make the software survive this situation (active starting with object generation 5.09):
- Directly after loading, numbers marked as NaN are replaced by zeroes. This check is done for configurations stored with object generations lower than 5.09.
- Starting with object generation 5.09, numbers in data fields are checked for NaN status (and replaced by zeroes if necessary) before they are saved in a configuration file.
- After each import of measured data the relevant data field is checked for NaN status and corrected if necessary.
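The cleanup step shared by all three mechanisms amounts to a simple replacement, sketched here in Python (illustrative only, not the actual CODE/SCOUT source):

```python
import math

def sanitize(values):
    """Replace NaN entries by zero so that subsequent computations
    cannot be poisoned by NaN values in a data field."""
    return [0.0 if math.isnan(v) else v for v in values]
```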
When you load a new configuration, SCOUT adjusts its main window dimensions (position and size) according to the values stored when the configuration was saved.
On the other hand, position and size of the main window of CODE are not changed when you load a new configuration.
Starting with object generation 5.08 you can have both behaviors in both applications. Use the menu command ‘File/Options/Appearance/Ignore main window settings after import of configuration file’ to change the behavior – it works like a checkbox. The state of the checkbox is saved to the file ‘additional_default_settings.ini’ in the user’s application folder when you close the application. The file is read when the application starts, and the behavior is adjusted according to the settings found in the file.
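The save-on-close, read-on-start mechanism could look like the following sketch. Only the file name ‘additional_default_settings.ini’ is given above; the section and key names used here are hypothetical placeholders.

```python
import configparser

# Hypothetical key name; the real ini layout may differ.
SECTION, KEY = "Appearance", "IgnoreMainWindowSettings"

def save_setting(path, ignore_window_settings):
    """Persist the checkbox state when the application closes."""
    cfg = configparser.ConfigParser()
    cfg[SECTION] = {KEY: str(ignore_window_settings)}
    with open(path, "w") as f:
        cfg.write(f)

def load_setting(path):
    """Read the checkbox state at application start; default to False."""
    cfg = configparser.ConfigParser()
    cfg.read(path)
    return cfg.getboolean(SECTION, KEY, fallback=False)
```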
The BREIN cleaner tool now compresses complete product folders of the BREIN archive. This is done on demand, as is already possible for user-selected day or month folders.
The current day folder is compressed automatically as long as the cleaner tool is active.
Starting with object generation 5.05 the list of spectra offers a new object type called ‘Function fit’. It allows the user to define a function which may contain terms that retrieve optical functions, fit parameters and integral quantities (the latter in CODE only). The function value is computed based on the current model and compared to a user-defined target value, or an interval of allowed target values. The fit deviation of the object is the absolute difference between the current value and the target value (or the closest limit of the target range), taken to a user-defined power. If the function value stays within the target interval the deviation is zero.
Objects of this type can be used to impose restrictions to the model. If, for example, an optical function returns the real part n of the refractive index at a certain wavelength, you can define a target interval for n and generate a large deviation outside the target interval. This forces the fit to look for solutions that are compatible with the given range of refractive index values.
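The deviation rule described above can be written down compactly. This is a sketch of the stated rule, not the actual implementation; a single target value is treated as the degenerate interval where both limits coincide.

```python
def function_fit_deviation(value, low, high, power=2.0):
    """Fit deviation of a 'Function fit' object: zero inside the target
    interval [low, high], otherwise the distance to the closest interval
    limit taken to a user-defined power."""
    if low <= value <= high:
        return 0.0
    distance = (low - value) if value < low else (value - high)
    return distance ** power
```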
Design targets for color values or other integrated spectral values may not always be well-defined numbers. Sometimes you want to search for designs whose color values stay within tolerated boundaries, while their exact position within those boundaries does not matter.
Such design situations can be handled in CODE using so-called ‘penalty shape functions’. However, the use of this concept turned out to be rather complicated.
We have now (starting with version 5.02) introduced a very easy definition of tolerated intervals for integral quantities: Instead of typing in the target value you can define an interval by entering two numbers separated by three dots, like ‘23 … 56’ or ‘-10 … -8’. If the integral value lies within the interval its contribution to the total fit deviation is zero. Outside the interval the squared difference between the actual value and the closest interval boundary is taken, multiplied by the weight of the quantity.
Based on the WOSP-LEDO light source we have developed a spectrometer system that can be used to measure reflectance spectra in the wavelength range 270 … 900 nm. The light source is based on LEDs only.
Two array spectrometers (not shown in the images) are used to record the sample signal and a reference signal at the same time. This makes the results independent of light source drifts.
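Why simultaneous recording cancels drift can be seen in a small sketch: any common intensity factor of the light source appears in both channels and drops out of the ratio. All names here are illustrative assumptions, including the use of signals measured on a calibration standard.

```python
def reflectance(sample, reference, sample_cal, reference_cal):
    """Two-channel drift correction sketch: a light-source drift scales
    the sample and reference channels by the same factor, so it cancels
    in the ratio (sample / reference). Dividing by the same ratio taken
    on a calibration standard removes channel-specific throughput."""
    return [(s / r) / (sc / rc)
            for s, r, sc, rc in zip(sample, reference, sample_cal, reference_cal)]
```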
Samples must be positioned within a few mm of the sample port of the system. The measured spectra are tolerant of small sample tilts or height differences.
Spectral quality is the same as for WOSP-LEDO. The unit can be used as a light source for transmittance measurements as well.
The system requires an external power supply of 12 V DC.
There is a new video about ‘bulk analysis’, i.e. optical fitting at high average speed using several instances of SCOUT and CODE in parallel.
Times change, and the way we handle software licenses is changing, too. The info page has been updated today – we are sure that for most users nothing will change in the way SCOUT and CODE are used.
Thanks to the competition between AMD and Intel, computers with many cores are becoming more and more affordable. Up to now, however, our software products SCOUT and CODE use only one core at a time. In order to benefit from the increased computing power of modern processors we have implemented MIPSS, which stands for Multiple Instances for Parallel Spectrum Simulation.
MIPSS can be used to significantly speed up batch control computations. We have published a video tutorial that shows how to do this.
In addition, a new mechanism called ‘Bulk analysis’ has been implemented both in SCOUT and CODE. It is applied when a large number of spectra need to be processed in a short time. A tutorial video about bulk analysis is in preparation.