Due to the extremely high specific surface area of nanoparticles and the corresponding potential for adsorption, the results of surface analysis can be highly dependent on the history of the particles, particularly regarding sample preparation and storage. The sample preparation method can, therefore, significantly influence the results. This report describes an interlaboratory comparison (ILC) with the aim of assessing which sample preparation methods for ToF-SIMS analysis of nanoparticles provided the most intra- and interlaboratory consistency and the least sample contamination. The BAM reference material BAM-P110 (TiO2 nanoparticles with a mean Feret diameter of 19 nm) was used as a sample representing typical nanoparticles. A total of 11 participants returned ToF-SIMS data, in positive and (optionally) negative polarity, using the sample preparation method “stick-and-go” as well as, optionally, “drop-dry” and “spin-coat.” The results showed that the largest sources of variation within the entire data set were caused by adventitious hydrocarbon contamination or insufficient sample coverage, with the spin-coating protocol applied in this ILC showing a tendency toward insufficient sample coverage; the sample preparation method and the participant had a lesser influence on the results.
I. INTRODUCTION
Technological advances in materials science have led to the development of nanoscale materials with novel and unique properties. Due to their size, nanoparticles possess particular properties, which can differ greatly from those of bulk materials with a similar chemical composition. The increasing use of these particles (and other nanomaterials) in developing products and technologies, together with their increasing abundance in both consumer products and the environment, requires a comprehensive understanding of the properties of these materials, particularly regarding their chemical and environmental hazards. This understanding is also important for the regulation of nanomaterials. Effective and reproducible methods for the characterization of the most important properties of nanoparticles are therefore needed so that relevant structure–property relationships and behavior in different environments can be determined and/or predicted.
Excluding applications such as nanomedicine, where factors such as drug loading are important, the properties of simple nanoparticles can be defined largely by the “three S’s”: size, shape, and surface chemistry.1 The extremely small size of nanoparticles means that their specific surface area is orders of magnitude higher than that of bulk materials, and the measurement of surface chemistry is, therefore, extremely important for their characterization.2
Because of this high specific surface area and the corresponding potential for interactions with the environment (for example, adsorption or surface reactions including oxidation), the history of a particular nanoparticle sample can have a disproportionately large influence on its surface chemistry; the importance of recording the sample history has been described previously in ISO standards and related publications.3,4 Production and preparation, as well as storage and handling, can result in unexpected contamination or degradation of these materials,5–10 even when particles are stored appropriately under cool and dark conditions. The evolution of nanoparticle surface chemistry over time is still an emerging field, and to date no definitive storage protocols have been established.
A. ToF-SIMS and sample preparation
Time-of-flight secondary ion mass spectrometry (ToF-SIMS) is a well-established and highly sensitive method for surface analysis of a wide variety of materials,11,12 with applications ranging from the characterization of meteorites13 to the analysis of polymer crystallinity14 to the mapping of nanoparticle distributions in cells.15 More specifically, ToF-SIMS probes the upper 3–5 nm of a material’s surface and is, therefore, a technique of choice for surface analysis of nanomaterials.16,17 The bombardment of the sample with a beam of high-energy primary ions induces a collision cascade, which penetrates only the first few nanometers of the surface11,12,16 and results in the ejection of a variety of species from the sample, including electrons, neutral species, and either positively or negatively charged secondary ions. These secondary ions can then be extracted using an electric field and separated using a time-of-flight mass analyzer. State-of-the-art ToF-SIMS instruments can provide a mass resolution (m/Δm) of up to 30 000,18 allowing multiple ions with the same nominal mass to be resolved accurately. ToF-SIMS is also extremely sensitive, able to detect substances in the ppb range. However, because the secondary ion yield is not simply proportional to the fractional concentration of a particular species but is also affected by other factors such as the matrix surrounding the material, the sputter yield, and the ionization probability of the fragment, ToF-SIMS is at best suitable for semiquantitative analysis.11,12,16 Another important point is that ToF-SIMS analysis is carried out in a UHV environment (p ∼ 10⁻⁶ Pa). This fact, together with the high surface sensitivity, requires careful sample handling.
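This limitation can be summarized with the basic SIMS equation; a commonly used formulation (added here for orientation, not part of the original experimental description) is
$$ I_m^{\pm} = I_p \, Y_m \, \alpha^{\pm} \, \theta_m \, \eta, $$
where $I_m^{\pm}$ is the detected secondary ion current of species $m$, $I_p$ the primary ion current, $Y_m$ the sputter yield, $\alpha^{\pm}$ the probability of ionization to a positive or negative ion, $\theta_m$ the fractional surface concentration of species $m$, and $\eta$ the transmission of the analysis system. Because $Y_m$ and $\alpha^{\pm}$ depend strongly on the surrounding matrix, peak intensities cannot be converted directly into surface concentrations.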
Appropriate sample preparation is an important part of every analytical technique, particularly for surface analysis of nanoparticles, for the reasons described earlier. In this case, the main requirements are broadly: (1) securely fixing the nanoparticles so that they are immobilized for consistent and repeatable analysis and are not released, causing damage or contamination to the instrument, and (2) optimizing sample preparation to achieve the most consistent results and to avoid introducing contaminants or incomplete substrate coverage, either of which results in peaks from species that are not inherent to the particles as received for testing. As described previously, these sample preparation methods are also useful for the analysis of nanoparticles using other surface analytical methods such as x-ray photoelectron spectroscopy (XPS).19–22
B. Interlaboratory comparison structure and objectives
The Versailles Project on Advanced Materials and Standards (VAMAS) document “Guidelines for the Design and Operation of Interlaboratory Comparisons (ILCs),” released in 2017, describes the goals of an ILC and gives guidelines for structuring one.23 These guidelines describe three possible (and mutually exclusive) objectives of an ILC: (i) assessing the interlaboratory performance characteristics of new and existing methods, (ii) assessing the performance of laboratories, or (iii) assessing one or more property values of the test material. The present ILC aimed to assess the interlaboratory performance characteristics of new and existing methods, specifically to assess which sample preparation methods for ToF-SIMS analysis of nanoparticles provide the best performance in terms of intra- and interlaboratory consistency, sample contamination, or any other relevant factors.
This study aimed to determine if:
– a particular sample preparation method provided clear advantages over others, in terms of reduced contamination or improved consistency or reproducibility in ToF-SIMS measurements, or
– there were any trends between different participants regarding particular sample preparation methods, or if results were sufficiently comparable with each other.
In most ILCs, a test material with a known “true” value determined from independent analytical methods is provided to participants, who measure a value whose repeatability and trueness/accuracy can be measured and statistically analyzed according to, for example, the ISO 5725 standards covering accuracy (trueness and precision) of measurement methods and results.24–29 However, in the present case, normal statistical metrics for ILCs such as Z- and zeta-scores are not appropriate because the current study of nanoparticle surface contaminants measures a spectrum rather than a single value. The spectra measured by each participant were instead compared to each other using multivariate analysis to determine similarities and differences between various participants and sample preparation methods, based on peak intensity.
In addition, the test materials provided (BAM-P110, 19 nm TiO2 nanoparticles) did not have a specific value for “trueness” of the surface chemistry to which the ToF-SIMS results could be compared. The bulk composition of the particles could theoretically be assessed and used as a baseline; however, this does not necessarily accurately reflect the surface chemistry of the nanoparticles before their exposure to the sample preparation methods described in this article.
Successful ILCs crucially rely on sufficiently homogeneous and stable test materials. A reference material was, therefore, used that is certified for its specific surface area measured according to the Brunauer–Emmett–Teller (BET) method.30 As a certified reference material, its stability has been proven, but the stability of the surface chemistry of nanoparticles can only be guaranteed to a limited extent. The stability of nanoparticles under storage is an emerging research area; however, some studies5,31–37 have already reported significant oxidation and other changes in nanoparticle surface chemistry arising from storage under cool dark conditions (e.g., 4 °C protected from sunlight) over time periods as short as 6 months. As yet, no storage protocol that guarantees the stability of the surface chemistry of these particles over longer time periods has been established; the reporting of storage and sample preparation conditions is therefore very important.4 During the course of this ILC, no studies were undertaken to assess the stability of the nanoparticles used; however, the relatively short turnaround time (approximately 3 months) for the analysis should ensure that the levels of storage-induced oxidation were relatively consistent across the various participants and, therefore, should not influence the results. In order to avoid light-induced chemical changes to the nanoparticles, participants were requested to store the samples at room temperature away from sunlight.
C. Principal component analysis (PCA)
PCA is a statistical technique for reducing the dimensionality of large data sets by creating new uncorrelated variables (the principal components, PCs), which maximize the variance in the data.38–43 The goal of PCA is to explain as much of the variance in the data as possible, as simply as possible. The separation of different sample sets on the principal component graph allows the results to be more easily compared and grouped. Simply, PCA returns two sets of values which should be interpreted together: “scores” which demonstrate which samples or sample sets can be differentiated from each other, and “loadings” which demonstrate the causes of these differences (in this case which peaks in the ToF-SIMS spectra are different between samples). In this analysis, the scores are grouped by participant and preparation method, and the loadings are plotted against m/z for each particular peak.
The PCs are listed in order of the amount of variance they explain in the data set: PC1 explains the greatest proportion of variance, PC2 the next-greatest, and so on. The number of PCs included in the analysis typically ranges from 2 to 4 and can be chosen either from a “scree plot,” in which the percentage of variance explained is plotted against the PC number and the cutoff is placed at the “knee” of the curve (beyond which further PCs bring diminishing returns), or by requiring that a given proportion of the variance in the data (such as 90%) is explained. In this work, the number of PCs was chosen using the “knee” approach, as shown in Fig. 1.
FIG. 1. Example of a scree plot (from the PCA in positive polarity, all participants); in this case, the “knee” can be found at the third or fifth principal component.
For further details and a deeper mathematical explanation, the reader is referred to the literature.38–43
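As a practical illustration of this selection step, the scree curve can be generated directly from a PCA fit; the following minimal R sketch assumes a hypothetical samples-by-peaks matrix named peak_matrix containing preprocessed peak areas.

```r
# Minimal sketch (assumed input): choose the number of PCs from a scree plot
# 'peak_matrix' is a hypothetical samples-by-peaks matrix of preprocessed peak areas
pca <- prcomp(peak_matrix, center = TRUE, scale. = FALSE)

# Percentage of total variance explained by each principal component
var_explained <- 100 * pca$sdev^2 / sum(pca$sdev^2)

# Scree plot: the "knee" marks where additional PCs bring diminishing returns
plot(var_explained, type = "b",
     xlab = "Principal component", ylab = "Variance explained (%)")
```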
II. EXPERIMENT
A. Participants
Participants were selected via open invitation44 and voluntary participation in the experiment. From a total of 14 participants who registered interest, 11 participants returned data. Nine of these participants measured using instruments from the same manufacturer (IONTOF GmbH). Two other participants returned data using instruments from other manufacturers (ULVAC-PHI and IONOPTIKA). Further details can be found in the supplementary information.45
B. Sample preparation
1. Preparation of samples for the ILC
1 g of BAM-P110 TiO2 nanoparticles30,46 (mean Feret diameter of 19 nm) in powder form, contained in glass vials protected from light, was sent to participants in November/December 2020. Before sending, these nanoparticles were stored at room temperature out of direct sunlight.
The test materials were homogenized as much as possible by manually tumbling the container (inverting alternately in the x or y direction) 30 times at a rate of approximately 1 tumble per 2 s before being scooped from the container and separated into samples sent to participants. The materials were sampled by scooping, and the container was additionally tumbled 10 times between the sampling for each participant. Because the particles are a homogeneous reference material with a relatively narrow size distribution and were being analyzed for surface chemistry rather than particle size, the method used for taking samples to send to participants focused on minimizing handling, rather than using common powder sampling techniques described in the literature and relevant ISO norms.47–49 Due to possible respiratory hazards, the nanoparticles were handled under a fume hood, which, compared to a clean room, may introduce some contaminants from the atmosphere because of the direction of air flow. Samples for participants were prepared on aluminum foil surfaces and using utensils that had been cleaned with HPLC-grade solvents (ethanol and isopropanol) from Sigma-Aldrich.
2. Sample preparation guidelines for participants
All participants were sent guidelines for sample preparation, measurement, and data analysis, which were based on those described in the literature.19 The following instructions were sent to participants:
a. Preparation of BAMP110 from powder: “Stick-and-go” (SG)
Materials
BAM-P110 nanoparticles (∼0.1 g)
Si-free double-sided adhesive support. We recommend either 3M Removable Repositionable Tape 665, or carbon adhesive tapes used in SEM, e.g., “Leit-Tabs” G3347 from Plano. (Release liners for adhesives commonly contain polysiloxanes, which can give erroneous signals.)
Sample support for SIMS instrument (a 1 cm2 square of Si wafer is also appropriate)
Glass sample slides or Al foil
ACS-grade isopropanol and laboratory wipes
High-purity N2 or compressed air stream
Method (a)
Thoroughly clean all tools and surfaces with isopropanol and laboratory wipes.
Fix the double-sided adhesive to the sample holder and remove the liner.
Take a spatula-tip of the nanoparticle powder and tip it onto the adhesive.
Spread the sample over the adhesive and press into the adhesive with the spatula, until as much of the powder is adhered as possible.
Check that the powder is fixed on the tape by inverting and tapping the sample holder, then blowing a stream of gas across it while inverted.
Three separate samples should be prepared, and each measured in three different spots.
Method (b)
Clean a glass slide or aluminum foil with an isopropanol-soaked laboratory wipe (e.g., by wiping in a single direction only) and allow the isopropanol to evaporate for 20 min under ambient conditions.
Place a spatula-tip of the powder onto the cleaned surface.
Fix the double-sided tape to the sample support and remove the liner.
Press the adhesive with the sample holder firmly onto the powder.
Check that the powder is fixed on the tape by inverting and tapping the sample holder, then blowing a stream of gas across it while inverted.
Three separate samples should be prepared, and each measured in three different spots.
b. Preparation of BAMP110 suspension
Materials
BAM-P110 nanoparticles (∼15 mg)
10 ml centrifuge tubes
Ultrapure water
Vortexer
Method
Accurately weigh 15 mg of nanoparticle powder into a 10 ml centrifuge tube.
Accurately weigh 8 g of ultrapure water.
Close the tube and vortex at 3000 rpm for 15 min.
Samples may be prepared using either the drop-dry (DD) or spin-coating (SC) method.
Note: Due to sedimentation occurring over time, it is recommended to cast the samples quickly after vortexing.
c. Sample preparation using “drop-dry” (DD) method
Materials
Precleaned silicon wafers
Viton O-ring (6.07 × 1.78 mm)
Nanoparticle suspension
Wafer holder, e.g., 25 mm coin style
Desiccator and vacuum line
Eppendorf pipette and tips
Method
Place the wafer in one half of the wafer holder and place a 3 μl drop of the suspension in the center.
Mount the Viton O-ring on the wafer around the droplet. Take care that the O-ring does not touch the droplet.
Place the wafer in the desiccator and apply vacuum until the droplet has dried.
Repeat until a closed and homogeneous layer is obtained. Complete coverage can be verified using optical microscopy or other suitable methods.
Three separate samples should be cast, and each measured in three different spots.
d. Method 2: spin-coating (SC) from aqueous solution
Materials
Precleaned silicon wafers
Spin-coater
Eppendorf pipette and tips
Method
Program the spin-coater. A sample program we used is as follows: step 1: 500 rpm/s ramp to 1000 rpm (5 s); step 2: 1000 rpm/s ramp to 2000 rpm (3 min); step 3: deceleration at 2000 rpm/s to 0 rpm (please adjust as necessary).
Place the wafer in the spin-coater and fix using vacuum.
Deposit 80 μl of the suspension on the wafer and start the program.
Remove when finished.
Confirm complete coverage of the substrate using SEM.
Three separate samples should be cast, and each measured in three different spots.
Participants were requested to return a detailed description of the sample preparation used, which is described in detail in the supplementary information.45 Participants were requested to return ToF-SIMS results at least for samples prepared using the SG method and measured in positive mode, with the option of using other methods and measuring in negative mode as desired. Table I summarizes the data returned from participants in the study.
TABLE I. Data returned from study participants using positive (+) and negative (−) polarity, various preparation methods, and various data formats. Participants A–G, J, and K used IONTOF instruments; participants H and I used instruments from other manufacturers.
Method | A | B | C | D | E | F | G | H | I | J | K
---|---|---|---|---|---|---|---|---|---|---|---
DD | +/− | +/− | +/− | +/− | + | +/− | + | |||||
SC | +/− | +/− | +/− | – | ||||||||
SG | +/− | +/(−)a | +/− | +/− | +/− | +/− | +/− | + | + | +/− |
a Data were returned but not usable.
3. ToF-SIMS measurements and data return
Participants were requested to measure the samples using static SIMS in spectrometry mode, using Bi3+ primary ions where possible (not available on all instruments). Participants were also requested to report their sample preparation and measurement procedures in detail; these are included (where returned) in the supplementary information.45 Data were generally returned as IONTOF raw data files and uploaded to a shared online folder; the data were then collated and analyzed together.
Data from the two participants with non-IONTOF instruments were returned either as a table of peak areas in a Microsoft Excel spreadsheet, based on a peaklist sent out with the sample preparation guidelines, or as data on the native instrument files along with the corresponding software for analysis. For this reason, a comparison of the data across all participants was limited to the analysis of the peaks provided in the Excel table. However, given that a deeper analysis may yield more detailed results, an in-depth analysis was conducted using data from participants using IONTOF instruments, which could be re-analyzed on the SurfaceLab 6.8 software (IONTOF GmbH). Coincidentally, only participants using IONTOF instruments returned data in negative polarity.
4. Data processing and PCA
The returned data were analyzed collectively (grouped by polarity, either positive or negative). Spectra were calibrated up to m/z = 223.8 (Ti3O5+) or m/z = 258.8 (Ti4O4H2−); mass interval lists (i.e., peaklists) were produced covering all significant peaks of all spectra, up to the maximum calibration range. Peaks were allocated to the expected materials, contaminants, and substrate species.
Data were manually preprocessed using Excel as follows:
– Normalized to the total sum of peak areas (for each spectrum)
– Divided by the square root of the mean peak area (for each peak)
– Mean-centered across the peak area (for each peak)
PCA was performed using the open-source software R (Ref. 50) with the prcomp function, which computes the PCA using singular value decomposition (also known as Q-mode PCA).52 This approach can handle high-dimensionality, low-sample-size data, unlike the more traditional R-mode PCA, which uses spectral decomposition of the covariance matrix of the data (in R, the princomp function).51 The scores and loadings for each principal component were then normalized to unit variance and graphed in Origin.
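For illustration, the preprocessing and PCA steps described above can be expressed compactly in R; the following is a minimal sketch only (the preprocessing in this study was carried out in Excel, and the object name peaks is hypothetical).

```r
# 'peaks': hypothetical samples-by-peaks matrix of raw peak areas

# 1. Normalize each spectrum (row) to its total sum of peak areas
norm_peaks <- peaks / rowSums(peaks)

# 2. Divide each peak (column) by the square root of its mean peak area
scaled_peaks <- sweep(norm_peaks, 2, sqrt(colMeans(norm_peaks)), FUN = "/")

# 3. Mean-center each peak (column)
centered_peaks <- scale(scaled_peaks, center = TRUE, scale = FALSE)

# PCA via singular value decomposition (Q-mode), as implemented in prcomp
pca <- prcomp(centered_peaks, center = FALSE, scale. = FALSE)

scores   <- pca$x          # one row per measured spectrum
loadings <- pca$rotation   # one row per peak in the peaklist
```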
In summary, three different PCAs were performed in this study:
– Data from all participants, positive polarity: enabled comparison of ToF-SIMS spectra across all participants, but with a reduced peaklist containing peaks common to all data returned;
– Data from participants using IONTOF instruments, positive polarity: allowed a comparison of more detailed ToF-SIMS spectra than the original peaklist;
– Data from all participants who returned data in negative polarity.
C. Challenges
Because this ILC, unusually, did not measure the variation of measurement results against a given “true” value, the analysis of the results allows only comparisons between different participants and sample preparation methods. The return of data in different formats was a major challenge in this analysis and is the reason that two different PCAs were undertaken for the results in positive mode.
The software available for ToF-SIMS spectral analysis at the primary authors’ institutes is SurfaceLab 6.8 (IONTOF GmbH). Since most of the participants measured using IONTOF instruments, most of the returned data could be analyzed together using this software. Two participants (H and I) returned data measured on instruments from other manufacturers, in other data formats. Participant I returned data as a Microsoft Excel table of peak areas, while participant H returned their measured spectra together with their analysis software. The comparison of data from all participants naturally required a common peaklist consisting of peaks present in all of the returned data sets, which meant a more restricted list of peaks to compare. However, a more detailed analysis containing a more extensive peaklist may provide further information and insights; therefore, a second PCA of the data in positive mode was conducted including only participants using IONTOF instruments (A, B, C, D, E, F, G, J, and K). Coincidentally, only participants using IONTOF instruments returned data in negative mode, so only one analysis was necessary for that polarity.
D. Data presentation
Due to the large number of spectra submitted by the participants and the number of peak-related variables in this study, it was not possible to label each peak or point in the pseudo-m/z plot in which the loadings are shown. Instead, each peak in the spectra was allocated to a particular species, and these species were grouped based on their likely source, for example, TiO2-based species, hydrocarbons, inorganic species, Si-based species (from the Si wafer substrate), siloxane-based species (potentially residues from the adhesive used in SG), and so on. This means that, for example, all peaks of TiO2-related secondary ions (46Ti+, 47Ti+, Ti+, TiO+, Ti2O3+, etc.) are included in the group named “TiO” and have the same color and symbol on the graph.
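One simple way to implement such a grouping is a lookup table that maps each peak assignment to a group label, which is then used to color the loading plot; the sketch below is a hypothetical R illustration (group names from Table II; the peak names and the objects loadings and peak_mz are assumed from the PCA step).

```r
# Hypothetical lookup: peak assignment -> species group (group names from Table II)
peak_groups <- c("Ti+" = "TiO", "TiO+" = "TiO", "Ti2O3+" = "TiO",
                 "C2H3+" = "CH", "C8H9+" = "CH",
                 "Si+" = "SiOx", "SiC3H9+" = "SiCH", "Al+" = "Inorg")

# 'loadings' (rows named by peak) and 'peak_mz' (m/z of each peak) are assumed
group <- factor(peak_groups[rownames(loadings)])

# Pseudo-m/z loading plot for PC1, colored and symbol-coded by species group
plot(peak_mz, loadings[, 1], col = as.integer(group), pch = as.integer(group),
     xlab = "m/z", ylab = "PC1 loading")
legend("topright", legend = levels(group),
       col = seq_along(levels(group)), pch = seq_along(levels(group)))
```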
Comparison of the score and loading plots for each principal component shows which measured samples differ from each other and which peaks in the mass spectrum are responsible for this difference. It was expected from this study that the results could be easily differentiated either by preparation method or by laboratory, giving an indication of which preparation method (if any) showed the lowest amount of variability and/or contamination, and which laboratories (if any) deviated from the group. Expected sources of difference center on various types of contamination that could potentially have been introduced during sample preparation; because nanoparticles have such an extremely high specific surface area compared to their volume, they are extremely likely to adsorb contaminants during transport, storage, and handling. Adsorbed hydrocarbons (both aliphatic and aromatic) are commonly found in ToF-SIMS spectra; they most likely originate from adventitious adsorption from the laboratory environment and are difficult to avoid. Other contaminants may be transferred from substrates or during sample preparation procedures, or peaks from the substrate may appear in the analysis due to incomplete sample coverage.
III. RESULTS AND DISCUSSION
A. All participants, positive polarity
As described earlier, due to the differences in data formats, the comparison of data from all participants was restricted to the peaks provided by the participants returning data in an Excel table. Nonetheless, this peaklist provided a good distribution of secondary ions from various materials in the sample (i.e., the TiO2 nanoparticles, hydrocarbon contaminants, inorganic contaminants, and species from Si-wafer substrates) to enable a sufficiently detailed PCA. Based on the scree plot shown as an example in Fig. 1, the first three PCs are analyzed, which together explain 66% of the total variance within the sample set.
The first question to examine is whether clear differences can be seen between the mass spectra of samples prepared using different methods. Figure 2 shows the score and loading plots for PC1, (a) grouped by preparation method and (b) separated by preparation method and participant. The loading plot [Fig. 2(c)] shows a clear separation between TiO2-related peaks (negative loadings) and peaks from other sources (mainly positive loadings). This indicates that, in this analysis, PC1 separates the samples showing strong TiO2-related peaks from those showing peaks from other sources; this separation accounts for 38% of all variance in the total sample set, i.e., PC1 describes the level of general contamination in the samples. The other peaks are allocated to species from various sources and, in order to simplify data analysis and visualization, have been grouped by color according to their allocation. These groupings are summarized in Table II.
FIG. 2. Comparison of score and loading plots for PC1 (38.16%) in positive polarity (all participants): (a) scores grouped by preparation method, (b) scores grouped by participant and preparation method, and (c) loadings grouped by species allocation.
TABLE II. Grouping of peak allocations for all PCAs in this paper, in both positive and negative modes.
Group | Description | Examples of peaks included
---|---|---|
OH | OH species | H3O+, O−, OH−, O2H−, H2O2− |
CH | Hydrocarbons | C2H3+, C8H9+, C3H+, C9H7+, C−, CH−, C2H3−, C3H5−, C4H3− |
OCH | O-containing hydrocarbons | CH3O+, C2H3O+, C3H5O+, C3HO3+, C2H3O−, C2HO2−, C4H5O−, C7H13O2− |
CHON | O- and N-containing hydrocarbons | CH2N+, CH5N+, C2H4N+, C3H4N+, NH−, CHN−, CNO−, C3NO− |
SiOx | Si, O, H-containing species | Si+, SiHO+, SiH2O2+, Si2O+, Si3H3O7+, Si−, 29Si−, SiH−, SiHO−, SiO2−, SiHO3− |
SiCH | Si-containing hydrocarbons | SiC2H+, SiC2H7+, SiC3H9+, SiCH−, SiCH3− |
SiCONS | Si-containing hydrocarbons also containing S, N | SiCHO−, Si3H4S−, Si3C2N−, Si2C2SN− |
Inorg | Inorganic species | Al+, Ca+, Fe+, K2PO2+, Cl−, AlO−, SO2−, KSOH−, P3H2O9− |
Inorg-CH | Inorganic-containing hydrocarbons | CH3OCl+, C3H3Na+, C3Cl2−, CCl3− |
CF | C and F-containing species | CF+, C3F+, C2OF+, F−, CF−, HF2−, CHF2−, C2F5− |
TiO | TiO-related species | 46Ti+, Ti+, TiO+, TiO2H+, 46TiO−, TiO−, TiH2O−, TiH3O2−, Ti3O6− |
TiCH | Ti-containing hydrocarbons | CTi+, CHTi+, CH3Ti+, C2H2Ti+ |
TiFS | Ti, F, and S-containing species | 47TiS−, OTiF−, TiF2−, CH2STi−, F3Ti− |
Peaks with positive loadings may arise from a variety of sources. The CH, OCH, and, to a lesser extent, Inorg-CH groups described in Table II may plausibly arise from the spontaneous adsorption of adventitious hydrocarbons in the laboratory environment onto the surface of the nanoparticles.52,53 Si-related peaks can arise from the silicon wafer substrate; another possible source is silicone residues from the release liner used for the double-sided adhesives. Despite the use of high-purity MilliQ water, a number of inorganic ions are commonly present, while various metals (e.g., aluminum) may be present as contaminants from surfaces or implements used in sample preparation. Owing to recombination in the collision cascade during ToF-SIMS measurements, species such as SiCH, TiCH, and Inorg-CH may appear in the spectrum even though they are not covalently bonded in the sample.
In summary, the loading plot separates clean samples from those with various sources of contamination. Comparing this to the score plot in Fig. 2(a), there is no clear distinction between the different sample preparation methods in terms of more or less contamination, as might have been expected; i.e., none of the preparation methods was clearly better or worse at introducing contaminants to the sample. One plausible explanation is that the level of contamination varies with the participants and their laboratories rather than with the preparation method. Contaminants from samples prepared using SG are likely to have been introduced before the sample was fixed onto the substrate, for example, from the storage environment or from tools used to prepare the sample. The choice of tape may also play a role; some double-sided adhesives can introduce contaminants onto the samples from polysiloxane coatings on the release liners. For samples prepared using the wet preparation methods, contamination from MilliQ water (even when prepared to specifications) is the most probable source. Each laboratory should establish its own best practice for minimizing contamination.
The same data grouped by both preparation method and participant, shown in Fig. 2(b), also do not show any trends by laboratory, as might be expected. Instead, some laboratories, such as A, C, and E, which returned data from multiple preparation methods, showed a much larger scatter in their PC1 scores than other laboratories such as I and J. This points to a further inconsistency in the degree of contamination introduced by sample preparation across different methods within one laboratory, indicating that some participants were able to perform some methods more consistently than others. Skill in working cleanly with a particular technique can vary greatly across different operators within a laboratory or may be related to the purity of reagents such as MilliQ water or the cleanliness of equipment. The interlaboratory repeatability of the results, therefore, depends mainly on the laboratory’s best practice and experience rather than on the method.
ILCs are commonly used to evaluate both within-laboratory and within-method variability. In this case, this could potentially be calculated using the standard deviation of the scores for each PC, for each preparation method, or for each laboratory. A quick glance at Figs. 2(a) and 2(b), however, shows that this approach alone would not make sense; the large amount of variation in PC scores within a laboratory but across different methods (or vice versa) would not yield meaningful results. An alternative approach is to plot the standard deviation of the PC scores against participant/method, as shown in Fig. S5 in the supplementary information,45 which shows that no single method performs significantly better than the others with respect to general contamination (PC1). In this way, the different methods can easily be compared to each other for consistency, and any stand-out participants can be identified. The complete set of score and loading plots for each PCA, as well as scree plots and standard deviations of scores, can be found in the supplementary information.45
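Such a comparison can be generated straightforwardly from the PC scores; the following minimal R sketch assumes a hypothetical data frame scores_df with columns participant, method, and PC1.

```r
# 'scores_df': hypothetical data frame with columns participant, method, and PC1
sd_by_group <- aggregate(PC1 ~ participant + method, data = scores_df, FUN = sd)

# Within-group spread of PC1 scores, one bar per participant/method combination
barplot(sd_by_group$PC1,
        names.arg = paste(sd_by_group$participant, sd_by_group$method, sep = "-"),
        las = 2, ylab = "Standard deviation of PC1 scores")
```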
PC2 describes the second-largest source of variation in the samples, as shown in the score and loading plots in Fig. 3. An analysis of the loading plot indicates that PC2 separates the nanoparticles and their associated contaminants from other peaks, including Si-related, inorganic, and OCH-containing contaminants. These peaks are all consistent with a Si wafer substrate, cleaned (as described in the supplementary information45) with ethanol, isopropanol, and MilliQ water. This separation suggests that PC2 describes the degree of substrate coverage. In this case, all samples prepared using the SC method showed a higher incidence of these substrate-related peaks, and a lower amount of TiO2 and CH-related peaks, indicating increased gaps in the substrate coverage. Based on these results, spin-coating is not recommended as an optimal sample preparation method. In some cases, the DD and SG methods also showed results corresponding to substrate- and cleaning-related species. Incomplete sample coverage is also possible for DD and SG samples; however, the SG samples would be expected to show a higher number of hydrocarbon peaks from the pressure-sensitive adhesive (most probably acrylic-based) on which the sample is mounted.
FIG. 3. Comparison of score and loading plots for PC2 (18.15%) in positive polarity (all participants): (a) scores grouped by participant and preparation method and (b) loadings grouped by peak allocation.
In summary, the largest source of variance across all samples is general contamination of the nanoparticles, which does not correspond to any one particular preparation method. The second-largest source of variance is insufficient sample coverage, which occurs for all spin-coated samples, as well as some samples prepared according to the DD and SG methods.
B. Positive polarity, participants using IONTOF instruments
Due to the restricted number of peaks available to the “all-participants” analysis, a second more detailed analysis was performed using the data from participants using IONTOF instruments and data returned in the SurfaceLab file format. Because of the common data format, a much more in-depth analysis and peak selection can be performed and, for example, the Si-related peaks arising from lack of Si-wafer coverage or polysiloxane contamination could potentially be differentiated.
Figure 4 shows the score and loading plot from PC1 of this data set. Comparison with Fig. 2 (PC1 in positive polarity, all participants) shows that this graph is essentially inverted. The orientation of positive/negative axes is arbitrary; the important point is which scores correspond to which loadings. The expanded list of peaks analyzed, however, still gives a similar result to the analysis for all participants; PC1 separates TiO2-related peaks from various types of contaminants and indicates how cleanly a particular experiment was performed. Again, there is no trend of increased contamination by preparation method, although the sample sets from participant A (SC), participant B (SG), and participant C (DD and SC) have distinctly higher contamination (and/or less surface coverage, considering the results from PC2) than average.
FIG. 4. Comparison of score and loading plots for PC1 (36.27%) in positive polarity (participants with IONTOF instruments): (a) scores grouped by participant and preparation method and (b) loadings grouped by peak allocation.
PC2 (Fig. S13 in the supplementary information45) also shows a similar result to the all-participants analysis presented in Fig. 3; PC2 separates TiO2-related peaks and their adsorbed hydrocarbon contaminants from SiOx-related peaks and peaks from cleaned Si wafers. Two significant differences compared to PC2 from the all-participants analysis are the presence of CHON species (N- and O-containing hydrocarbons), which have negative loadings, and the higher molecular weight TiO2-related peaks, which have loadings closer to zero. The CHON peaks may arise from adventitious contamination of the Si wafer by organic substances during the cleaning or coating process. Alternatively, PC2 may also differentiate hydrocarbon and CHON contamination patterns on the nanoparticles. The outstandingly high positive score of the SG experiment from participant A (Fig. 3) is probably caused by a larger than normal level of adventitious hydrocarbon contamination on the nanoparticles.
PC3 (Fig. S15 in the supplementary information,45 accounting for the next 11.25% of the variance) begins to separate out single participants. From the score plot, participant B clearly differs from all other participants. Comparing this to the loading plot, there are no characteristic groups of peaks that stand out as being caused by one particular contaminant. Visual comparison of the spectra from participant B showed no stand-out aberrations compared to spectra from other participants. Discussion with the participant revealed nothing unusual in the sample preparation; however, one aperture in the instrument was partially eroded, which may have affected the results. This suggests that instrumental parameters may also influence the spectra, but not as strongly as sample preparation.
PC4 (Fig. S17 in the supplementary information45), while relevant according to the scree plot, does not show any clear separation according to the sample preparation method, participant, or peak allocation.
C. Negative polarity
Measuring data in negative polarity was an optional component of the ILC. Six participants, all using IONTOF instruments, returned data in negative polarity. PCA was performed as described previously.
Figures 5 and 6 show the first two principal components in the negative polarity. In this case, PC1 describes the level of substrate coverage (based on the comparison of Si- and SiO-related peaks to TiO2-related peaks and their associated contaminants), and PC2 describes the amount of hydrocarbon contamination on the samples. Similar to positive polarity, spin-coating shows a consistently higher abundance of Si- and SiO-related peaks, which corresponds to poorer sample coverage.
FIG. 5. Comparison of score and loading plots for PC1 (33.00%) in negative polarity: (a) scores grouped by participant and preparation method and (b) loadings grouped by peak allocation.
FIG. 6. Comparison of score and loading plots for PC2 (24.35%) in negative polarity: (a) scores grouped by participant and preparation method and (b) loadings grouped by peak allocation.
The reversal of the order of surface coverage and hydrocarbon contamination as the largest sources of variation in the samples may be due to differences in the probability of forming positive or negative ions from different species; for example, adsorbed hydrocarbons on TiO2 nanoparticles show a strong and distinctive peak pattern in positive polarity but fewer prominent peaks in negative polarity.
The large amount of scatter for participant G in Fig. 6 is noteworthy in this PC, indicating a large amount of within-laboratory variation in sample contamination. The fact that this corresponds to CH and OCH-related peaks suggests some localized hydrocarbon contamination on the sample.
In PC3 (Fig. 7), individual participants again begin to separate out. In this case, participant D shows strong signals from inorganic contaminants (mainly related to chlorine), as well as some TiO2- and TiCH-related peaks. This participant prepared samples using both a wet (DD) and a dry (SG) method, so it is unlikely that these peaks are contaminants from expected sources such as MilliQ water. In PC3, the individual laboratories start to group together by score, indicating that PC3 accounts for some kind of interlaboratory variable, whether sources of contamination, instrument parameters, or (theoretically) inhomogeneities between the samples supplied. This separation by participant can also be seen, to a lesser extent, in PC3 of the PCA in positive polarity from participants using IONTOF instruments (Fig. S15 in the supplementary information45).
FIG. 7. Comparison of score and loading plots for PC3 (10.26%) in negative polarity: (a) scores grouped by participant and preparation method and (b) loadings grouped by peak allocation.
D. Discussion
Sample preparation is a critical component of the surface analysis of nanoparticles, owing to the need to mount the nanoparticles adequately with unbroken substrate coverage and to minimize contaminants that adsorb onto the surface of the nanoparticles. It was expected that one or more sample preparation methods might clearly stand out as better or worse for surface analysis or that one or more laboratories would stand out from the group.
The main sources of variation within the results were adventitious (mainly hydrocarbon) contaminants adsorbed onto the surface of the particles and insufficient coverage of the substrate. Which of these two factors was the primary source of variation differed between the analyses in positive and negative polarity; this is most likely because particular species are more likely to form positive or negative ions and therefore give stronger peaks in the respective ToF-SIMS spectra.
Due to the very large specific surface area of nanoparticles, some degree of adsorption of organic compounds is unavoidable; ideal sample preparation will minimize this contamination (for example, through handling in clean environments or under inert atmospheres), or at least ensure that it is consistent and known. The results of this ILC show that none of the methods tested stands out as clearly optimal for minimizing contamination; the samples did not separate in the score plots as cleanly by preparation method as might be expected, which indicates that certain types of contamination are not necessarily inherent to particular sample preparation procedures. Additionally, no single participant stood out in the analysis; the level of contamination or surface coverage often varied strongly between the different sample preparation methods used by an individual participant. This suggests that each participant has some methods that they perform better than others. We recommend that each laboratory use suitable reference materials to assess its preparation methods and the potential influence of contamination on its results.
Spin-coating as applied in this study showed some (although not absolute) tendency toward incomplete sample coverage; however, this may also be influenced by factors such as the dispersion protocol and particle concentration. If this method is used, it is recommended that complete coverage be verified using other analytical methods such as microscopy or XPS.19
The lack of a clear best method for minimizing contamination has the disadvantage that an optimal sample preparation procedure for ToF-SIMS analysis of nanoparticles could not be determined, which hinders efforts to standardize the method. On the other hand, having a variety of suitable sample preparation methods to choose from is an advantage, depending on the availability of laboratory equipment (e.g., spin-coaters or high-quality ultrapure water) or the state in which the nanoparticles are available (powder or dispersion), particularly as different sample preparation methods have been shown in previous work19 to influence various properties of the nanoparticles, including damage to sensitive coatings.
Individual participants only begin to stand out in the third and fourth principal components, and in the data seen here do not correlate with any particular set of contaminants or other peaks; this separation may be caused by particular factors unique to the laboratory-specific ToF-SIMS measurement procedure. Instrumental conditions, however, are in no case the main sources of variation between results, even in the data set which compares measurements from different instruments. This is a clear positive result for the ToF-SIMS community in general.
Due to this lack of separation, as well as the large variation among different measurements within the same laboratory, the optimal sample preparation method is ultimately a decision that should be made and confirmed by each laboratory through testing; no stand-out best method could be determined from this study. This means that there is a great deal of flexibility in the choice of sample preparation method, particularly if nanoparticles are, for example, already present in a suspension and cannot be prepared using the SG method. While the variation between laboratories for particular methods could be a cause for concern, the fact that the main source of difference is sample contamination means that there is also scope for optimization within each laboratory. Each laboratory should, therefore, establish and test its own protocols for reducing contamination and guaranteeing the best coverage of the sample on the substrate; suitable reference materials should be used for this purpose. Because of the extremely high surface sensitivity of ToF-SIMS, contamination effects should be considered in the interpretation of ToF-SIMS spectra of nanomaterials. For publication, it may be desirable to show data from suitable reference or test materials showing contaminants or substrate peaks and their effect on the results.
The approach used in this study may be used to optimize best-practice sample preparation methods within a particular laboratory, for example, to ensure consistency between operators or as a quality control measure over time. Optimizing the consistency and reliability of ToF-SIMS, including sample preparation methods, further supports its use as a robust method for the analysis of nano- and advanced materials.
IV. SUMMARY AND CONCLUSIONS
Three methods (two wet and one dry) were compared for the preparation of TiO2 nanoparticles for surface analysis using ToF-SIMS. No sample preparation method stood out as clearly superior to the others, which is a disadvantage for method standardization but has the advantage that suitable methods are available for both nanoparticle powders and suspensions. Nor did any single participant stand out in the main sources of variance, even when instruments from other manufacturers were used, which is a positive result for the ToF-SIMS community. This study provides a basis for the development of best-practice methods for the preparation of nanoparticles for surface analysis, which should be developed and monitored in each individual laboratory.
ACKNOWLEDGMENT
The authors acknowledge funding from the European Union Horizon 2020 Project ACEnano (Analytical and Characterization Excellence in nanomaterial risk assessment: A tiered approach, Grant Agreement No. 720952).
AUTHOR DECLARATIONS
Conflict of Interest
The authors have no conflicts to disclose.
Author Contributions
Francesca Bennet: Formal analysis (equal); Investigation (lead); Methodology (lead); Validation (supporting); Visualization (lead); Writing – original draft (lead); Writing – review & editing (lead). Robert Optiz: Formal analysis (equal); Investigation (supporting); Software (equal); Validation (supporting); Writing – review & editing (supporting). Narges Ghoreishi: Formal analysis (equal); Investigation (supporting); Software (equal); Validation (supporting); Writing – review & editing (supporting). Kristina Plate: Formal analysis (equal); Investigation (supporting); Software (equal); Validation (supporting); Writing – review & editing (supporting). Jean-Paul Barnes: Investigation (supporting); Validation (supporting); Writing – review & editing (supporting). Allen Bellew: Investigation (supporting); Validation (supporting); Writing – review & editing (supporting). Anna Belu: Investigation (supporting); Validation (supporting); Writing – review & editing (supporting). Giacomo Ceccone: Investigation (supporting); Validation (supporting); Writing – review & editing (supporting). Eric de Vito: Investigation (supporting); Validation (supporting); Writing – review & editing (supporting). Arnaud Delcorte: Investigation (supporting); Validation (supporting); Writing – review & editing (supporting). Alexis Franquet: Investigation (supporting); Validation (supporting); Writing – review & editing (supporting). Francesco Fumagalli: Investigation (supporting); Validation (supporting); Writing – review & editing (supporting). Douglas Gilliland: Investigation (supporting); Validation (supporting); Writing – review & editing (supporting). Harald Jungnickel: Investigation (supporting); Validation (supporting); Writing – review & editing (supporting). Tae Geol Lee: Investigation (supporting); Validation (supporting); Writing – review & editing (supporting). Claude Poleunis: Investigation (supporting); Validation (supporting); Writing – review & editing (supporting). Derk Rading: Investigation (supporting); Validation (supporting); Writing – review & editing (supporting). Hyun Kyong Shon: Investigation (supporting); Validation (supporting); Writing – review & editing (supporting). Valentina Spampinato: Investigation (supporting); Validation (supporting); Writing – review & editing (supporting). Jin Gyeong Son: Investigation (supporting); Validation (supporting); Writing – review & editing (supporting). Fuyi Wang: Investigation (supporting); Validation (supporting); Writing – review & editing (supporting). Yung-Chen Andrew Wang: Investigation (supporting); Validation (supporting); Writing – review & editing (supporting). Yao Zhao: Investigation (equal); Validation (supporting); Writing – review & editing (supporting). Alexander Roloff: Resources (equal); Supervision (supporting); Writing – review & editing (supporting). Jutta Tentschert: Conceptualization (equal); Methodology (supporting); Writing – review & editing (supporting). Jörg Radnik: Conceptualization (equal); Methodology (supporting); Project administration (lead); Supervision (lead); Writing – review & editing (supporting).
DATA AVAILABILITY
The data that support the findings of this study are available from the corresponding author upon reasonable request.