1/31/09

Inorganic Chemistry (3rd Edition)



Excellent Inorganic Textbook (5 stars)
There are many insufficient inorganic textbooks out there, but Housecroft is not one of them. With its easy-to-follow but in-depth discussions and lots of practice questions, this is a great book for anyone learning inorganic chemistry. Highly recommended!

Great intro and reference (5 stars)
Great introduction to inorganic chemistry.
The first chapters introduce the theoretical concepts which will later be useful to describe the chemistry of the elements, and do so in a fool-proof way. The discussion is obviously not that detailed, but gets to the point.

The best part of this book, however, is the one covering the elements and their chemical behavior. Extremely instructive the first time through, and still very useful later on as a reference. The numerous references to how the elements find use in today's industrial world are a great source of information.
The two chapters on catalysis and solid state chemistry are also valuable, albeit concise.
The only book that may rival this one is Greenwood's, though it isn't as good as a learning text.

Weird that the price is so high at Amazon... I got it for half the price... new...

LGOs are not my friends! (4 stars)
Well, I just finished up inorganic chemistry and I found this book to be somewhat helpful. The first few chapters are not too bad, and it does a decent job of explaining introductory molecular symmetry (point groups, IR spectroscopy...). My issue with this book was its explanation of bonding in polyatomic molecules. Its ligand group orbital approach was so convoluted I had to use my PCHM book (McQuarrie) to paint a somewhat clearer picture. Also, the chapter on d-block chemistry could use more math to explain crystal field theory. Anyhow, I only used two different books, so I cannot comment on what book to use and/or not to use. Hope this helps... Oh yeah, honk if you passed PCHM!

About Inorganic Chemistry (3rd Edition) detail

  • Amazon Sales Rank: #11962 in Books
  • Published on: 2007-12-09
  • Original language: English
  • Number of items: 1
  • Binding: Paperback
  • 1136 pages

Inorganic Chemistry (3rd Edition) Description

A market leading textbook offering a fresh and engaging approach to the teaching of modern inorganic chemistry while giving a clear, well-balanced introduction to the key principles of the subject. The full-colour text design with three dimensional illustrations brings the subject to life. Throughout the book students are able to reinforce their learning with the use of worked examples and self-study exercises. Numerous applications and topic boxes also relate the chemistry to everyday life.


1/30/09

Chemometrics: A Practical Guide (Wiley-Interscience Series on Laboratory Automation)



A "User's Guide" to chemometrics (5 stars)
Chemometrics by Beebe et al. is my first recommendation to colleagues interested in the successful use and theory of chemometrics. It is a user focused book with theory sufficient to guide the appropriate and informed use of chemometric techniques in a variety of analyses. It is a "hard read" in the best sense of that phrase. You will probably have to work at getting through this book, but you do not need a degree in mathematics to understand and enjoy it.

If you are using one of the many "shrink-wrapped" chemometrics software packages available today, and you are troubled by the gap between these techniques' great utility and your limited understanding of the underlying theory, Beebe is a great place to start.

Chemometrics for the near-beginner (5 stars)
The authors have put together a supremely useful guide to actually using chemometric techniques, as opposed to academic research. Consequently, my copy is getting dog-eared from frequent use. Perhaps the most helpful part of the book is the multitude of tables that explain when each different technique might best be used, and how to understand and interpret the diagnostics that arise from the calculations. The table of questions to ask BEFORE an experiment is worth the price of the book. If you're trying to write new chemometric algorithms, buy something else. But if you're trying to apply chemometric techniques in the 'real world' with the highest information-to-effort ratio, you need a copy of this book.

Solid, well-written book (5 stars)
Admittedly, I'm biased as the authors are friends. However, this is one of the best books on chemometrics, particularly for those who have some exposure to the subject and wish to learn more. The authors have drawn on their years of experience explaining complex results to non-expert clients to present some complex mathematical ideas in an understandable fashion.

About Chemometrics: A Practical Guide (Wiley-Interscience Series on Laboratory Automation) detail

  • Amazon Sales Rank: #878739 in Books
  • Published on: 1998-03-31
  • Original language: English
  • Number of items: 1
  • Binding: Hardcover
  • 360 pages

Chemometrics: A Practical Guide (Wiley-Interscience Series on Laboratory Automation) Description

An outstanding practical guide to the most common chemometric methods in use today

Chemometrics explains how to apply the most widely used pattern recognition and multivariate calibration techniques to solve data analysis problems. This practical guide describes all key methods in terms of processes and applications in order to help the reader easily identify the best technique for a given situation.

Drawing on years of industrial experience with chemometric tools, the authors share their six basic steps, or "habits," for achieving reliable chemometric results, and cover key areas such as:
* Defining and understanding the problem
* Experimental planning and design
* Preprocessing of samples and variables
* Supervised and unsupervised pattern recognition
* Classical and inverse methods of multivariate calibration

Complete with helpful chapter-end summaries, technical references, and more, this book is an invaluable hands-on resource for analytical chemists and laboratory scientists who use chemometrics in their work.


1/29/09

How to Use Excel in Analytical Chemistry and in General Scientific Data Analysis



Nice (4 stars)
I'd give it 5 stars, but I've only used about 3 pages thus far.

How to use Excel in Analytical Chemistry by Robert de Levie (5 stars)
Excellent book setting out applications in chemistry using Excel. Well-chosen examples embracing spectroscopy, chromatography, electrochemistry, kinetics and acid-base theory illustrate the ease of manipulation of data via Excel. I really liked the section on Fourier transforms, which is followed by good coverage of noise filtering and deconvolution.

About How to Use Excel in Analytical Chemistry and in General Scientific Data Analysis detail

  • Amazon Sales Rank: #619550 in Books
  • Published on: 2001-02-15
  • Original language: English
  • Number of items: 1
  • Binding: Paperback
  • 502 pages

How to Use Excel in Analytical Chemistry and in General Scientific Data Analysis Description

Spreadsheets provide one of the most easily learned routes to scientific computing. This book uses Excel®, the most powerful spreadsheet available, to explore and solve problems in general and chemical data analysis. It follows the usual sequence of college textbooks in analytical chemistry: statistics, chemical equilibria, pH calculations, titrations, and instrumental methods such as chromatography, spectrometry, and electroanalysis. The text contains many examples of data analysis, and uses spreadsheets for numerical simulations and testing analytical procedures. It treats modern data analysis methods such as linear and nonlinear least squares in great detail, as well as methods based on Fourier transformation. It shows how matrix methods can be powerful tools in data analysis, and how easily these are implemented on a spreadsheet. It describes in detail how to simulate chemical kinetics on a spreadsheet. It also introduces the reader to the use of VBA, the macro language of Microsoft Office, which lets the user import higher-level computer programs into the spreadsheet.


1/28/09

Moisture Determination

Statistics and Chemometrics for Analytical Chemistry



Very useful book! (4 stars)
This is an excellent statistics book for chemists. It has good examples of data from chemical assays and provides relevant analysis of the data to illustrate the points. I work in the pharmaceutical industry and I use it all the time, to give myself a good, all-round understanding of statistics for chemistry. This book doesn't go into in-depth statistics, so if you need something more meaty, look elsewhere. Overall, I found this book to have everything I've needed. I recommend it highly.

I don't know this edition, but the 2nd edition is great! (5 stars)
I don't know this edition, but I think the 2nd edition is great! I'm a Ph.D. chemist who wanted to learn more statistics, and to learn it better. I found this book concise, and the authors explained fundamental concepts with admirable clarity. Many of my colleagues and I consider it our best reference on statistics!

About Statistics and Chemometrics for Analytical Chemistry detail

  • Amazon Sales Rank: #333183 in Books
  • Published on: 2005-12-30
  • Original language: English
  • Number of items: 1
  • Binding: Paperback
  • 268 pages

Statistics and Chemometrics for Analytical Chemistry Description

This popular textbook gives a clear account of the principles of the main statistical methods used in modern analytical laboratories. Such methods underpin high quality analyses in areas such as the safety of food, water and medicines, environmental monitoring, and chemical manufacturing. The treatment throughout emphasises the underlying statistical ideas, and no detailed knowledge of mathematics is required. There are numerous worked examples, including the use of Microsoft Excel and Minitab, and a large number of student exercises, many of them based on examples from the analytical literature. This book is aimed at undergraduate and graduate courses in Analytical Chemistry and related topics. It will also be a valuable resource for researchers and chemists working in analytical chemistry.


1/27/09

Chemometric Techniques for Quantitative Analysis



An ideal tutorial on chemometrics (5 stars)
Chemometrics is not intrinsically difficult to understand, but it presents a high barrier, especially for beginners. One reason may be that most tutorials employ unnecessarily many equations and describe calculation algorithms in too much detail. The basic concepts of PCR and PLS should instead be developed along the line from CLS to ILS: the discovery of the advantages of ILS, and the idea of using latent variables via PCA, are the most important steps to fully understanding PCR and PLS. Richard Kramer writes the book along these lines, so we can comprehensively understand the ideas behind multivariate analysis of spectra. He uses many figures, as if we were performing the simulations ourselves. After fully understanding this book, we can understand other tutorials more deeply and easily.
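For readers following the CLS → ILS → PCR progression the reviewer praises, here is a minimal principal component regression sketch on synthetic spectra: project the spectra onto the leading principal components (the latent variables), then run an ILS-style regression on the scores. All data and component counts below are invented for illustration; this is not an example from the book.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "spectra": 30 samples x 50 wavelengths, generated from
# 2 underlying components plus a little noise (all values invented)
concentrations = rng.uniform(0, 1, (30, 2))
pure_spectra = rng.uniform(0, 1, (2, 50))
spectra = concentrations @ pure_spectra + rng.normal(0, 0.01, (30, 50))
y = concentrations[:, 0]  # the analyte we want to calibrate for

# PCR: find latent variables by PCA (via SVD), then regress on the scores
X = spectra - spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
scores = X @ Vt[:2].T  # keep 2 latent variables
coefs, *_ = np.linalg.lstsq(scores, y - y.mean(), rcond=None)
y_pred = scores @ coefs + y.mean()

print(round(float(np.corrcoef(y, y_pred)[0, 1]), 3))
```

The point of the latent-variable step is exactly the one the reviewer highlights: the regression sees two well-behaved scores instead of 50 collinear wavelengths.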

About Chemometric Techniques for Quantitative Analysis detail

  • Amazon Sales Rank: #918909 in Books
  • Published on: 1998-06-10
  • Original language: English
  • Number of items: 1
  • Binding: Hardcover
  • 220 pages

Chemometric Techniques for Quantitative Analysis Description

Shows how to produce and use qualitative analytical calibrations in a laboratory or production environment, estimate time and resources required to develop analytical calibrations, and employ the quantitative software provided with a wide range of instruments and commercial software packages. DLC: Chemistry, Analytic - Quantitative - Statistical methods.


Handbook of Near-infrared Analysis



About Handbook of Near-Infrared Analysis, Third Edition (Practical Spectroscopy) detail

  • Amazon Sales Rank: #523934 in Books
  • Published on: 2007-09-07
  • Original language: English
  • Number of items: 1
  • Binding: Hardcover
  • 826 pages

Handbook of Near-Infrared Analysis, Third Edition (Practical Spectroscopy) Description

Fast, inexpensive, and easy-to-use, near-infrared (NIR) spectroscopy can be used to analyze small samples of virtually any composition. The Handbook of Near Infrared Analysis, Third Edition explains how to perform accurate as well as time- and cost-effective analyses across a growing spectrum of disciplines. Presenting nearly 50% new and revised material, this thoroughly updated edition incorporates the latest advances in instrumentation, computerization, calibration, and method development in NIR spectroscopy. The book underscores current trends in sample preparation, calibration transfer, process control, data analysis, and commercial NIR instrumentation. New chapters highlight novel applications including the analysis of agro-forestry products, polymers, blood, and control serum. They also cover NIR spectra, process analytical technologies (PAT), quantitative and qualitative analyses for nutraceuticals, NIR photography uses in medicine, and counterfeit detection methods for pharmaceuticals and currency. Offering the most complete single-source guide of its kind, the Handbook of Near Infrared Analysis, Third Edition continues to offer practicing chemists and spectroscopists an unparalleled combination of theoretical foundations, cutting-edge applications, and practical experience provided firsthand by more than 60 experts in the field.


1/26/09

Optical Process Probe integrates sapphire window.

Transmission probe SWOPP03C has a polished sapphire lens sealed into a 316 stainless steel body, offering a chemically inert, abrasion-resistant interface to the material under test. Sapphire, a trigonal crystal of aluminum oxide, has a 2,040°C melting point, a 0.17-5.5 micron optical transmission range, a Young's modulus of 330 GPa and a shear modulus of circa 150 GPa. The probe comes with a 600 micron internal glass fiber and permits measurements at pressures to 10,000 psi and temperatures to 400°C.

1/25/09

Potential of near-infrared reflectance spectroscopy and chemometrics to predict soil organic carbon fractions

The potential of near-infrared reflectance spectroscopy (NIRS) to predict soil organic C in different particle-size fractions was evaluated. Soil samples (n = 180) from various crop rotations in Uruguay were analysed by standard chemical and NIRS methods. Partial least squares (PLS) regression with cross-validation was used to develop calibrations between reference data and NIRS spectra (n = 87), validated using an independent set of samples (n = 87). Coefficients of determination in calibration (R²) and standard errors in cross-validation (SECV) were 0.90 and 0.6 for coarse-sand C, 0.92 and 0.4 for fine-sand C, and 0.96 and 2.1 for clay + silt C, respectively. Calibrations were poor for the C/N ratio (R² < 0.65). Although NIRS demonstrated great potential to predict soil organic C in different particle-size fractions, the nature of sampling and the number of samples analysed should be considered in future developments.

1/23/09

Near Infrared Analysis (NIR)

As a Process Analytical Technology (PAT) tool, the NIR (Near Infrared) analyser is the next-generation technology for analysing solid and liquid chemical and pharmaceutical formulations. Its patented design offers superior analytical performance, with increased sensitivity and precise instrument matching to enhance method development, minimize implementation time and ensure seamless method transferability.

Near-infrared analysis is perfect for the rapid, non-destructive quantification of many constituents, including protein, moisture and fat. The NIR analyser measures both liquid and powder samples without removing the sampling or optical accessories, and uses a DTGS detector. When changing between liquid and solid analysis, a simple manual lever is rotated to open or close a shutter between the side port and the top bench. Liquid samples are measured in transmission with disposable 8 mm vials. Powder samples are measured on the side port accessory in reflectance through disposable scintillation vials.

The NIR analyser can be used throughout your manufacturing processes, from raw material inspection and in-process testing to final product release. It is a relatively new technology for refineries to monitor main products, intermediate streams and unit feeds, and it offers a low-cost alternative to traditional gas chromatographs or distillation analysers. Unlike gas chromatographs, NIR probes are 100% optical, use no electricity and contain no moving parts. Multiple properties can be measured with one analyser head. The NIR analyser is ideal for use in laboratories as well as in plant environments.

For pharmaceutical and chemical industries, NIR analysis provides increased economic benefits, improved tests and analysis on the dock, in the lab and on the process line. NIR testing is completely safe and requires no sample preparation. No harmful waste is created since NIR analysis requires no solvents or reagents. It can reduce the cost of routine testing by more than 50% depending on the application.

How to use it

Before testing, place the instrument on a horizontal, hard and stable surface away from sunlight. The warm-up period is about 2 hours. The user should ensure that the temperature of the room where the analyser is stored stays between 10 and 40 °C.

The Near Infrared (NIR) analyser is used as a cost-effective means of measuring product or process consistency and quality control parameters in organic chemical-related or food industries.

Traditional laboratory methods are cumbersome, expensive and require trained personnel. These methods are too slow to allow sufficient reaction time for control of the process. Now, NIR analysers have replaced most of the traditional methods. These analysers can be placed directly in the production area and can be operated by plant personnel. The analysis time is less than one minute. This eliminates individual analysis on each constituent and saves manpower, training and time.

This article is issued in the public interest by Applied Instrument Technologies (AIT), the process analytical technology business within Hamilton Sundstrand, a United Technologies company. AIT delivers process analytical technology solutions to the leading companies of the world. We design and manufacture robust process development and on-line analyzers for quantitative and qualitative analysis.

1/22/09

Third of four parts: Does Intoxilyzer source code matter?

Criminal defense attorney Jeff Sheridan grew concerned about the reliability of the state’s method for testing drunken drivers when clients kept telling him they tested for alcohol concentrations out of the ballpark for the number of drinks they consumed.

Sheridan watched the state adjust its Intoxilyzer machine in the past due to glitches. So it didn’t surprise him to think the current machine might have problems.

In 2006, he asked a judge for the “source code,” the computer code that controls the Intoxilyzer 5000EN by analyzing a person’s blood-alcohol content and converting the data into a numerical reading. The machine is used to test most of the 30,000-plus drivers arrested each year in Minnesota for drunken driving.

He struck it lucky in a Dakota County courtroom, breaking new ground in the DWI (driving while impaired) law when First Judicial District Court Judge Richard Spicer ordered the state to provide the source code to a drunken-driving defendant, who was challenging the revocation of his driver’s license.

“It was the first time anybody had been asked (for the code),” said Spicer, the first judge in Minnesota to order the source code to be produced. “I think they waited for just the right case.”

What might have been an open-and-shut matter, however, has been prolonged because the state didn’t have the source code and the manufacturer refused to disclose it, citing trade-secret rights. Eagan attorney Jeff Sheridan believes DWI defendants should be allowed to analyze the computer code that controls the state’s Intoxilyzer 5000EN; he says the state never completed a full validation study when it replaced the machine’s software.

As more and more judges followed Spicer’s lead, prosecutors began advising police to switch to blood or urine draws, instead of risking that a judge would throw out breath results.

While everyone waits for the state Supreme Court to decide whether defendants should be provided the source code — and if so, under what conditions — prosecutors and defense attorneys continue to disagree over whether the source code is even relevant.

While law enforcement agencies in Scott County have resumed breath tests, Dakota and Carver counties continue using the more time-consuming blood and urine tests until the Supreme Court issues a decision.

Not surprisingly, prosecutors argue the Intoxilyzer machine has been proven to be effective and defense attorneys are just on a wild goose chase, trying any tactic to get their clients off the hook.

Dakota County Attorney Jim Backstrom, president of the Minnesota County Attorneys Association, calls it an “undocumented fishing expedition aimed at delaying or avoiding justice.”

Criminal attorneys allege there are plenty of reasons to doubt the reliability of the machine, and the computer behind it is key to knowing whether the Intoxilyzer really is failure-proof.

Sheridan sees the matter as one of justice.

“If we’re going to use the machine and only the machine to put people in prison, why can’t we know how it’s going to work? Is there some national security at stake?” Sheridan said. “It seems so ridiculous to even have to ask for this. It’s so clearly discoverable.”

Does it work?

The Intoxilyzer 5000EN relies on infrared technology to determine the alcohol concentration of drivers. Versions of the instrument have been in use in Minnesota since 1984.

When a drunken-driving suspect blows into the machine, an infrared light causes alcohol molecules to vibrate or absorb light at a particular frequency. The difference in light emitted and received is computed to determine the percentage of blood-alcohol content.

Defense attorneys across the country have been requesting Intoxilyzer source codes to see if there are any errors that could be affecting the machines’ reliability.

In New Jersey, a study of the Alcotest 7110, a machine similar to the Intoxilyzer, found more than 19,400 potential errors in the machine’s source code. After a three-year legal battle, however, the state’s Supreme Court upheld use of the machine.

Backstrom said the source code itself is not going to tell anything about the reliability of the machine. “The source code basically converts the English language into machine language so the machine can do its work and come up with documentation for the results,” he said.

Criminal attorneys argue that the source code is important because the computer is the brains behind the Intoxilyzer 5000EN, and those brains are old, relying on an antiquated Z-80 microprocessor.

“The Intoxilyzer is nothing more than a glorified computer,” said defense attorney Sam McCloud of Shakopee. “It doesn’t even use the most up-to-date computer technology; it uses computer technology from the ’80s.”

Prosecutors and the state of Minnesota maintain the machine has been tested rigorously by the state. “There are a lot of procedures and safeguards put in place to ensure these tests are accurate,” said Backstrom. “Intoxilyzer machines have been in place and utilized across America for many years. They are extremely reliable machines.”

According to a fact sheet put out by the Minnesota Department of Public Safety, the Intoxilyzer was subjected to validation testing before being put to use, including tests that used simulator solutions with a known alcohol concentration, as well as testing on live subjects.

“Based on the results of this validation testing, the (Minnesota Bureau of Criminal Apprehension) concluded that the instrument performed properly, yielding accurate and reliable breath-alcohol measurements,” the department said. “Each time the instrument is used, the results are measured against a simulator, which would indicate a potential problem with the instrument, at which time it would be examined and taken out of service, if necessary.”

Each time the machine is used, the suspect provides at least two breath samples and each sample is tested twice. The readings must be within 0.02 percent of each other, and if they vary, the lowest result is used. Simultaneously, the machine tests a solution with a known alcohol concentration.
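One assumed interpretation of that acceptance rule can be sketched in code. This is illustrative only, not the Intoxilyzer's actual logic; the function name and data layout are invented, and the article does not say whether the 0.02 tolerance applies within each sample or across all readings (the sketch applies it across all of them).

```python
def evaluate_breath_test(samples):
    """Sketch of the rule described above (assumed interpretation):
    all readings must agree within 0.02; report the lowest if they vary."""
    readings = [r for sample in samples for r in sample]  # two replicates per sample
    if max(readings) - min(readings) > 0.02:
        return None  # readings disagree too much: test rejected
    return min(readings)  # the lowest result is the one used

# Two samples, two replicate measurements each, in percent BAC
print(evaluate_breath_test([(0.09, 0.10), (0.10, 0.09)]))  # agree within 0.02
print(evaluate_breath_test([(0.06, 0.10), (0.10, 0.09)]))  # spread too wide
```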

Enough study?

The Intoxilyzer 5000EN has had its “brains” (two computer silicon chips) replaced three times, said Sheridan, but the state only performed a control study the first time.

That means the computer that currently controls the machine never went through a validation study to compare breath measurements against blood or urine samples taken from the same people.

One of the times the computer was changed was after the state Legislature decided samples taken from defendants had to measure within 0.02 percent of each other in order to be counted, Sheridan said. Before, there was no minimum range.

The machine’s computer was replaced again, Sheridan said, when it was discovered the machine wasn’t kicking out samples outside the 0.02-percent range like it was supposed to.

1/21/09

New Manufacturing Method Results in Non-Cytotoxic Nanorods for In-Vivo Therapeutics and Imaging

A revolutionary new gold nanorod manufacturing method has been developed at Nanopartz, where their patent-pending gold nanorods are proving their ability to be the future in-vivo nanoparticle for the detection and treatment of solid tumor cancers, without the need for surgery.
This new proprietary method removes the cytotoxic cetyl trimethylammonium bromide (CTAB) capping agent necessary for manufacturing, replacing it with polyethylene glycol (PEG). PEG has low toxicity and is used in a variety of products; it is the basis of a number of laxatives, skin creams and many other well-known products. The combination of the gold nanorods with PEG coatings, named Ntracker, has shown half-life circulation times of greater than 17 hours in mice.

For years researchers have known of the advantages offered by gold nanorods for in-vivo and in-vitro applications. Gold nanorods have unprecedented photothermal absorption characteristics. That is, they efficiently convert light to heat. In addition, gold nanorods are very good light scatterers. These two properties, light scattering and photothermal absorption, are the two properties necessary for successful in-vivo therapeutics, imaging, and diagnostics.

But until now, the limitation on commercializing gold nanorods for in-vivo and in-vitro applications has been toxicity. Nanopartz has now solved this problem with the release of their Ntracker nanorods.
In solid tumor cancer therapy, the gold nanorods can be functionalized with coatings that are specific, or even non-directional, for solid tumor cancers. Once injected, the nanorods circulate throughout the body and, over time, concentrate in the targeted tumor. The nanorods, manufactured to absorb at specific near-infrared wavelengths, readily convert near-infrared light to heat. Since human skin and tissue are fairly transmissive to near-infrared light, a low-power near-infrared laser may be used outside the body to heat the solid tumor to temperatures that destroy the tumor cells without affecting healthy tissue adjacent to the tumor, all without the need for surgery.

As a contrast enhancement agent, gold nanorods may be manufactured to scatter at discrete wavelengths anywhere in the near-infrared. This property lends itself to the ability to enhance detection of solid tumor cancer that might normally be missed using conventional techniques. Currently, researchers are using nanorods as a contrast enhancement for Photoacoustic Imaging, Optical Coherence Tomography, as well as Surface Enhanced Raman Scattering.

Other potential applications of gold nanorods include improving the efficiency of solar cells, use in negative-refractive-index materials (i.e., a Harry Potter-style invisibility cloak), and optical polarizers for sunglasses.
"We are very excited at the potential applications of these nanorods,” said Christian Schoen, President of Nanopartz.

1/20/09

Method for providing general calibration for near infrared instruments for measurement of blood glucose

A method is disclosed for accurately providing general calibration of near-infrared quantitative analysis instruments for almost any individual user. The general calibration method comprises comparing an individual's near-infrared spectrum to a plurality of near-infrared spectral clusters. Each near-infrared spectral cluster has a set of calibration constants associated with it. The calibration constants of the spectral cluster most closely matching the individual's spectrum are used to custom-calibrate the near-infrared analysis measurement instrument.
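The claimed scheme (match an individual's spectrum to the nearest spectral cluster, then use that cluster's calibration constants) can be sketched as follows. All numbers, the Euclidean distance metric and the cluster library are invented for illustration; the abstract does not specify these details.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical library: 3 spectral clusters, each with a mean spectrum and
# its own calibration constants (slope, intercept) -- all values invented
wavelengths = 50
cluster_means = rng.normal(0, 1, (3, wavelengths))
cluster_calibrations = [(1.8, 0.3), (2.1, -0.1), (1.5, 0.7)]

def calibrate(individual_spectrum):
    """Pick the cluster whose mean spectrum is closest (Euclidean distance,
    one plausible similarity metric) and return its calibration constants."""
    distances = np.linalg.norm(cluster_means - individual_spectrum, axis=1)
    return cluster_calibrations[int(np.argmin(distances))]

# A new user's spectrum near cluster 1 picks up cluster 1's constants
spectrum = cluster_means[1] + rng.normal(0, 0.05, wavelengths)
print(calibrate(spectrum))
```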

1/3/09

Boning up on skeletal remains

A fast statistical method for analyzing spectroscopic data has been developed by US researchers to allow crime scene investigators and forensic scientists to more quickly and easily obtain a post-mortem interval on recovered skeletal remains.

The flesh is weak and once it has rotted away, the skeleton remains. However, there are few precise techniques that forensic scientists can use to determine the time since death quickly and easily with only bones to hand. In hot and humid environments the problem is even worse, shortening the time between death and the skeletalization process that renders the body opaque to conventional analytical time-of-death techniques without recourse to major lab-based analysis.

Now, chemist Kenneth Busch, co-director of the Center for Analytical Spectroscopy, at Baylor University, in Waco, Texas, and his team have exploited the fact that bones lose water and the proteins within bones decompose into their constituent amino acids over time. He and his colleagues have now tracked these changes using near-infrared (NIR) reflectance spectroscopy and ultraviolet-visible (UV/Vis) emission and absorption spectroscopy and then applied a statistical regression modelling approach, to correlate the changing spectra with the post-mortem interval (PMI). Their laboratory tests, they say, have an error rate as low as four days for bones that are 90 days old.

"In forensic investigations, establishing the time of death is a key piece of evidence," explains Busch, "Forensic scientists frequently categorize human remains in terms of their post-mortem interval (PMI), which is the time elapsed since a person died." He points out that in areas like Texas that have extreme climates with high heat and high humidity, skeletalization, or "excarnation", of a body happens relatively quickly. "Under these conditions, the determination of the PMI is frequently problematic because of the rapid decomposition of the tissues routinely used to determine PMI," adds Busch.

The researchers used 28 different pig femurs that were up to three months old and applied the various spectroscopic techniques, which are sensitive to moisture and protein content, to obtain data that could be interpreted in terms of PMI. The approach is entirely non-destructive, so they can test any skeletal remains without the need to physically remove samples.

The researchers found that the diffuse reflectance spectra of bones did not follow a straight-line pattern as the bones age, so they segmented the data into three sets, which were then used to construct three statistical models of the aging process. They found that this approach could reduce the prediction error still further compared with the original 90-day model. A combination of the two approaches, a discriminant analysis model followed by a segmented regression model, gave the optimal results.
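A toy version of that segmented-regression idea (assign each bone to an age segment first, then fit a separate line per segment) might look like the sketch below. The piecewise feature, thresholds and noise level are all invented, and the simple threshold classifier is only a crude stand-in for the discriminant analysis step the team describes.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for the bone data: a spectral feature that decays
# non-linearly with post-mortem interval (PMI), so one line fits poorly
pmi = rng.uniform(0, 90, 120)  # days since death
feature = np.piecewise(
    pmi,
    [pmi < 30, (pmi >= 30) & (pmi < 60), pmi >= 60],
    [lambda t: 2.0 - 0.03 * t,
     lambda t: 1.1 - 0.01 * (t - 30),
     lambda t: 0.8 - 0.004 * (t - 60)],
) + rng.normal(0, 0.02, 120)

# Segmented regression: classify each sample into an age segment (here by
# thresholding the feature), then fit a separate line per segment
masks = [feature > 1.1, (feature <= 1.1) & (feature > 0.8), feature <= 0.8]
errors = np.empty_like(pmi)
for mask in masks:
    slope, intercept = np.polyfit(feature[mask], pmi[mask], 1)
    errors[mask] = (slope * feature[mask] + intercept) - pmi[mask]

print(round(float(np.sqrt(np.mean(errors ** 2))), 1))  # RMS error in days
```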

"We do it over a certain set of wavelengths, then take all the data from our instrument and put it in a statistics program and analyze it in various ways," explains chemistry graduate student Patricia Diamond, "No one is doing the spectroscopic work we've done."

Busch and colleagues, Diamond (who presented the results), Marianna Busch, and Jody Dogra, revealed details of their technique at the annual meeting of the Federation of Analytical Chemistry and Spectroscopy Societies in October.

"In perfect conditions in the laboratory, the method looks very encouraging," explains Busch. "Once a regression model is built from spectral data, you could find out the age of the bones in a matter of minutes, rather than taking hours or days." He adds that, "Our method isn't absolute - we can just give a range - but once a regression model is built, the time it takes to determine the age of a bone is cut down significantly."


Predictive ability of NIR spectroscopy for pig meat quality evaluation

Despite intensive research on fast methods for meat quality evaluation, the practical use of such methods under industrial conditions in the meat sector remains scarce. NIR spectroscopy is considered one of the most promising techniques. The objective of the present study is to evaluate the predictive value of NIR spectroscopy for pig meat quality evaluation, namely to build and evaluate calibrations for determining intramuscular fat content and the technological quality of meat. Meat sampling will be adapted to achieve the necessary variation range for each parameter under study, with selection based on the principal factors affecting the trait. Thus, for intramuscular fat, selection will be based on variation due to breed (Duroc genes), subcutaneous fat thickness and muscle type, whereas for the technological quality of meat, pigs of three genotypes (NN, Nn, nn) will be selected according to the RYR1 gene mutation (allele n), which is responsible for meat of low technological quality. Measurements of some meat quality traits (pH, water-holding capacity, colour) and standard chemical analysis of intramuscular fat content will be performed. The reflectance spectra of all samples will be recorded in the visible (408-1092 nm) and near-infrared (1108-2492 nm) regions using a NIRSystem 6500 (FOSS). The predictive ability of NIR spectroscopy will be assessed on 140 samples (2 muscles, 70 pigs) for intramuscular fat content and on 105 samples (105 pigs, 35 per genotype, 1 muscle) for the technological quality of meat. Predictions based on spectral information will be made using the WinISI statistical package; for linear data, global calibration equations with cross-validation will be used. To discriminate between the genotypes NN, Nn and nn, an artificial neural network classifier, available in the WinISI statistical package, will be used.

Cattle Feeding: Understanding Forage Testing And Analysis

Feeding a balanced diet that meets but does not exceed the nutritional needs of cattle becomes increasingly important as input prices continue to rise. Using averages or "book values" for feeds can result in over- or under-feeding certain nutrients, says University of Nebraska animal scientist Rick Rasby. Feed analysis, he says, can help producers provide more economical and better-balanced diets. He offers these tips for testing and analyzing forage and other feeds.

Analyze feeds for moisture, protein, and energy when designing diets for beef cattle.

Testing labs typically report results on an as-is and dry matter basis. Nutrients should be balanced in a diet on a dry-matter basis because that is the way nutrient requirements for cattle are reported. After formulation on a dry-matter basis, values can be converted to an as-is basis, using the moisture content of the feed, to determine the actual amount of feed to provide.
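The dry-matter bookkeeping described above is simple arithmetic, sketched below. The function name and the example feed values are illustrative, not from the article.

```python
def as_fed_amount(dm_required_lb: float, moisture_pct: float) -> float:
    """Convert a dry-matter feed requirement to an as-is (as-fed) weight.

    dm_required_lb: pounds of feed the ration calls for on a dry-matter basis.
    moisture_pct:   moisture content of the feed, percent as-is (from the lab).
    """
    dry_matter_fraction = 1.0 - moisture_pct / 100.0
    return dm_required_lb / dry_matter_fraction

# Example: a ration calling for 20 lb of hay dry matter, with hay testing
# 12% moisture, requires 20 / 0.88, roughly 22.7 lb of hay as fed.
```

Wet feeds show why the conversion matters: at 65% moisture, the same 20 lb of dry matter would require over 57 lb of feed as fed.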

Sight, smell, and touch are useful for determining stage of maturity at harvest, foreign material or pests, color, leafiness and signs of spoilage. Physical evaluations alone, however, can be misleading and rarely are sufficient for predicting eventual animal performance.

The most common feed analyses use chemical processes to determine nutrient levels. Chemical testing of representative feed samples allows accurate predictions of animal performance because nutrient requirements also were determined using chemically tested feeds.

Near-infrared reflectance (NIR) spectroscopy is a rapid, reliable, low-cost method to analyze feeds for their nutrient content. When sending a sample in to be tested using NIR, be sure to identify the type of feed or forage being submitted. The NIR method will not accurately evaluate a full mineral profile of a sample, but it does appear to accurately determine calcium and phosphorus.

The NIR method tends to underestimate the energy (TDN) content of distillers' grains because of their high fat content. NIR will, however, adequately measure moisture, percent crude protein, calcium, and phosphorus in distillers' grains. University of Nebraska data suggest that distillers' grains have 125 percent of the energy value of corn in forage diets.

Source: Drovers news staff

Process optimization with light energy: implementing NIR helps the pulp industry control chemical consumption

The cost of process chemicals used by the global pulp and paper industry is estimated to reach US$40 billion annually by 2010. In addition, there are escalating pressures to comply with stiffening environmental standards, coupled with the need to maintain quality output without adversely affecting the bottom line.

As a result, more cost-effective and improved process control practices are needed in the paper industry, including implementation of at-line, real-time measurement of the Kappa number and brightness of the chemical pulp.

THE PROBLEM

Kappa number (the index by which the amount of lignin in pulp is measured) and brightness play an important role in optimizing the amount of process chemicals used during the delignification and bleaching processes. They are also valuable parameters, along with pulp moisture content, of the salable finished product.

Current methods of accurately testing both Kappa number and brightness are not only time consuming, but require the use of toxic chemicals and skilled technicians. On average, a paper mill can spend as much as US$115,000 per year on consumable chemicals used for testing, plus an additional US$100,000 per year (in man hours) for sample preparation. Titration, the most common analysis technique for obtaining Kappa number, takes upward of one hour to complete a single analysis, while brightness requires a more complex sample preparation and can take 24 hours or more to complete.

Once the Kappa number and brightness values are known, the amount of process chemicals required for achieving the desired pulp brightness can be determined. Increasing the frequency of in-process measurements of Kappa number and brightness throughout the pulping and bleaching process provides the opportunity to optimize use of delignification and bleaching chemicals, and minimize unnecessary waste and pollutants.

NIR

Near-infrared (NIR) technology has proven to be a trusted and versatile analytical measurement tool in various forestry applications for many years. It is a non-destructive, non-contact materials measurement method that requires little or no sample preparation, and can analyze samples in a matter of seconds. By implementing NIR analysis, process control technicians gain the ability to collect approximately 12 times more data points in an at-line testing environment.

NIR devices use wavelengths spanning the visible and near-infrared regions, from 350 nm to 2,500 nm. Throughout this range, the light energy interacts with the C-H, N-H, and O-H bonds of the sample, producing a "fingerprint" spectrum (Figure 1). This spectrum can then be correlated to the Kappa number and brightness of pulp as determined by the traditional reference chemistry techniques. It is necessary to have a calibration that mathematically relates the spectra measured by the NIR instrument to a reference measurement such as titration. Samples that are represented in the model database can then be accurately predicted.

[FIGURE 1 OMITTED]

Spectral information in the visible region, particularly around 457 nm, can be correlated to brightness, while the near-infrared region, which contains the chemical information on lignin residuals in pulp, can be correlated to Kappa number.

Once testing points or locations for sampling have been determined, the calibration model must be built. Model building is essentially creating an equation relating multiple variables, correlating the measured spectra with the desired unknowns such as Kappa number and brightness. This involves calculating a regression equation from the NIR spectra together with the Kappa number and brightness data collected by titration and the traditional brightness test. Since reference chemistry methods are expensive and time consuming, it is important to maximize efficiency and create the best possible calibration.

The key challenges in the creation of a calibration are identification of samples to use for the model, obtaining accurate and reproducible reference assays, and the development of the calibration equation using the chemometric modeling tools that combine the spectra and reference data to form the actual calibration model.

It is paramount that the model be representative of the raw material being used. If a mill uses both hardwood and softwood, then both wood types must be included in the calibration model. A model must also be validated to ensure its accuracy in predicting Kappa number and brightness. This can be accomplished while building the model simply by holding back a portion of the sample data and then using it for validation. In other words, the majority of the sample data is used to build the model, which is then validated by independent testing against the held-out samples.
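The build-then-validate cycle described above can be sketched as follows, again on synthetic data. The sample and band counts, the simulated Kappa range, and the plain least-squares regression are all assumptions for illustration; a real pulp calibration would use measured spectra paired with titration reference values, and typically a multivariate method such as PLS.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in: NIR spectra paired with reference Kappa numbers
# obtained by titration. All sizes and ranges here are hypothetical.
rng = np.random.default_rng(2)
n_samples, n_bands = 120, 50
kappa = rng.uniform(10.0, 30.0, n_samples)          # reference Kappa numbers
basis = rng.normal(0, 1, n_bands)                   # simulated lignin response
spectra = np.outer(kappa, basis) + rng.normal(0, 0.3, (n_samples, n_bands))

# Hold back a portion of the samples for independent validation, as described.
X_cal, X_val, y_cal, y_val = train_test_split(
    spectra, kappa, test_size=0.25, random_state=0)

# Build the calibration model on the majority of the data...
model = LinearRegression().fit(X_cal, y_cal)

# ...then validate against the held-out samples. A low standard error of
# prediction (SEP) indicates the model generalizes to unseen pulp samples.
sep = float(np.sqrt(np.mean((model.predict(X_val) - y_val) ** 2)))
```

The same held-out check is what would flag the model-drift cases the next paragraph describes: a rising SEP after a feedstock or instrument change signals that the calibration needs updating.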

The calibration model must be updated anytime a change occurs in the incoming feedstock, raw materials, processing conditions or instrument set-up. Additionally, periodic validation will help to ensure that the model continues to be accurate. Samples that have a higher error by reference analysis may not be well-represented, or this may be an indication that equipment maintenance is required. These samples are then used in the next version of the calibration model.

...
