yuanpeng / Fe45Cr_Notes.md
Texture Correction for Fe45Cr
-
It seems this sample has a stronger texture effect, causing odd behavior in the high-angle region. If you grab the files ending with "_corrected_bkg_rm_fittedbkg.dat", you will see the input data for the second-stage correction. Those files are the scattering patterns for each 2Theta band after the first-stage correction and background subtraction. For the background subtraction, we fitted an exponential function of the form 'a + b * exp[-c * Q^2]', and after the subtraction we expect the pattern to sit on the baseline, i.e., at 0 (a small sketch of this fit is included after the file path below). Here below I am showing some of the background-subtracted patterns, in which you can see the odd behavior for the 157 and 162 degree bands. In the two regions marked with red arrows, there are no peaks at all for 157 and 162, so there is no hope for any correction and we have to remove them. To do this, we can either delete all files corresponding to those two 2Theta's, or we can edit the input json file to remove them from the grouping. You can find my latest input json file at
/SNS/users/y8z/NOM_Shared/Yuanpeng/texture_correction_Mar2025/Fe45Cr_ZYP/ttheta_group_params_new_new_new_new_new_new_new.json
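For reference, here is a minimal sketch of that exponential background fit and subtraction using scipy; the file name, starting values, and two-column layout are assumptions for illustration, not taken from the actual processing script.

```python
import numpy as np
from scipy.optimize import curve_fit

def bkg_func(q, a, b, c):
    # Background model quoted above: a + b * exp(-c * Q^2)
    return a + b * np.exp(-c * q ** 2)

# Hypothetical two-column (Q, intensity) pattern for one 2Theta band,
# before background subtraction.
q, intensity = np.loadtxt("band_157_pattern.dat", unpack=True)

# Fit the exponential background and subtract it so the pattern sits on zero.
popt, _ = curve_fit(bkg_func, q, intensity, p0=[0.0, intensity.max(), 0.1])
intensity_sub = intensity - bkg_func(q, *popt)
```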
-
Although we implemented the background subtraction before the second-stage correction, the patterns are still sometimes not well suited for the integration. In some regions the pattern sits slightly above the baseline, or part of a peak dips below it. This makes the peak-area calculation, and therefore the spherical harmonics fitting, inaccurate, yielding odd-looking results. For example, if part of a peak is below the baseline, the negative intensities can push the integrated intensity close to 0; the fitted coefficient can then become very large, producing a huge peak sitting partly above and partly below the baseline. To address this, I changed the script slightly to offset the pattern after the background subtraction so that there are no negative intensities (a sketch of this offset follows the script path below). The new script is located at
/SNS/users/y8z/NOM_Shared/Yuanpeng/texture_correction_Mar2025/Fe45Cr_ZYP/texture_proc_real_1step_not_aligned_2step_aligned.py
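The core of that change is just a constant offset; a minimal sketch of the idea (the real logic lives in the script above) could look like this:

```python
import numpy as np

def offset_to_baseline(intensity):
    """Shift a background-subtracted pattern so its minimum sits at zero,
    removing any negative intensities before the peak-area integration."""
    intensity = np.asarray(intensity, dtype=float)
    min_val = intensity.min()
    return intensity - min_val if min_val < 0.0 else intensity
```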
-
The original chunk definition is too coarse in some regions, for both Group-1 and Group-2. The updated chunk definition can be found in my new json file mentioned above.
yuanpeng / texture_corr_steps.md
Steps for Texture Correction
-
First, we need to prepare the grouping file, which divides the detectors into small groups according to their polar and azimuthal angles. The MantidTotalScattering (MTS) reduction will then take the grouping file and reduce the data into those small groups (a toy sketch of the angular binning idea follows the run command below).
-
Go to /SNS/NOM/shared/scripts/texture and run the texture_group_gen.py script like this,

mantidpython texture_group_gen.py
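For intuition only, here is a toy sketch of the kind of angular binning such a grouping encodes; the bin widths and group-numbering scheme below are made up for illustration and are not necessarily what texture_group_gen.py does.

```python
import numpy as np

# Hypothetical angular bins: 5-degree polar (2Theta) bands and
# 30-degree azimuthal sectors.
polar_edges = np.arange(0.0, 185.0, 5.0)
azimuthal_edges = np.arange(-180.0, 210.0, 30.0)

def group_id(polar_deg, azimuthal_deg):
    """Map a detector's polar/azimuthal angles (degrees) to a group index."""
    i = np.digitize(polar_deg, polar_edges) - 1
    j = np.digitize(azimuthal_deg, azimuthal_edges) - 1
    return i * (len(azimuthal_edges) - 1) + j

print(group_id(42.0, 15.0))
```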
-
yuanpeng / output_group.json
{
    "bank_1": [0, 35],
    "bank_2": [35, 55],
    "bank_3": [55, 75],
    "bank_4": [75, 105],
    "bank_5": [105, 135],
    "bank_6": [135, 180]
}
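As a quick illustration of how these bank boundaries might be consumed (assumed usage, not taken from the reduction code), the sketch below looks up which bank a given 2Theta value falls into:

```python
import json

with open("output_group.json", "r") as f:
    banks = json.load(f)

def bank_for_two_theta(two_theta):
    """Return the bank name whose [low, high) 2Theta window contains the value."""
    for name, (low, high) in banks.items():
        if low <= two_theta < high:
            return name
    return None

print(bank_for_two_theta(42.0))  # "bank_2" with the ranges listed above
```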
yuanpeng / texture_proc_real_1step_not_aligned_2step_aligned.py
import json
import numpy as np
import os
import scipy
from scipy.optimize import minimize
from scipy.optimize import curve_fit
from scipy.signal import argrelextrema
from pystog import Pre_Proc
import matplotlib.pyplot as plt
import random
yuanpeng / ttheta_group_params_new_new_new.json
{
    "Group-1": {
        "QRange": {
            "12": {
                "LeftBound": 0.5,
                "RightBound": 7.8
            },
            "17": {
                "LeftBound": 0.75,
                "RightBound": 7.8
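The snippet is only the top of the file, but it suggests per-2Theta Q windows; a hedged sketch of reading one of those windows and clipping a pattern to it (assuming the rest of the file follows the same layout) could be:

```python
import json
import numpy as np

with open("ttheta_group_params_new_new_new.json", "r") as f:
    params = json.load(f)

# Q window for the 12-degree band of Group-1, per the snippet above.
bounds = params["Group-1"]["QRange"]["12"]

# Hypothetical pattern clipped to the [LeftBound, RightBound] window.
q = np.linspace(0.1, 10.0, 500)
intensity = np.exp(-q)  # placeholder data
mask = (q >= bounds["LeftBound"]) & (q <= bounds["RightBound"])
q_win, intensity_win = q[mask], intensity[mask]
```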
yuanpeng / NOMAD_Invalid_groups_exported_1.csv
2Theta,Group ID
2,1
2,2
2,3
2,4
2,5
2,6
2,7
2,8
2,9
yuanpeng / remove_invalid_banks.py
import csv
import numpy as np
import os

data = []

csv_file = 'NOMAD_Invalid_groups_exported_1.csv'

with open(csv_file, 'r') as file:
    csv_reader = csv.reader(file)
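The preview stops inside the `with` block; a plausible continuation, given the CSV header shown earlier, would skip the header row and collect the (2Theta, Group ID) pairs into `data`. This is an assumed sketch, not necessarily how the real script proceeds:

```python
with open(csv_file, 'r') as file:
    csv_reader = csv.reader(file)
    next(csv_reader)  # skip the "2Theta,Group ID" header row
    for row in csv_reader:
        # Each row pairs a 2Theta band with an invalid group ID to be removed.
        data.append((int(row[0]), int(row[1])))
```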
yuanpeng / wksp2data.py
# import mantid algorithms, numpy and matplotlib
from mantid.simpleapi import *
import matplotlib.pyplot as plt
import numpy as np
import os
from pathlib import Path

nxs_file = "./SofQ/NOM_Si_640e.nxs"
out_dir = "./texture_proc"
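A hedged sketch of how such a workspace-to-data conversion could continue from the variables above (the real wksp2data.py may structure its outputs differently): load the processed NeXus file with Mantid and write each spectrum, one per 2Theta group, as a two-column text file.

```python
from mantid.simpleapi import Load

ws = Load(Filename=nxs_file)
Path(out_dir).mkdir(parents=True, exist_ok=True)

for i in range(ws.getNumberHistograms()):
    q = ws.readX(i)
    s_of_q = ws.readY(i)
    # readX may return bin edges; trim to the number of Y points.
    out = np.column_stack((q[: len(s_of_q)], s_of_q))
    np.savetxt(os.path.join(out_dir, f"group_{i + 1}.dat"), out)
```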
yuanpeng / silicon.json
{
    "Facility": "SNS",
    "Instrument": "NOM",
    "Title": "NOM_Si_640e",
    "Sample": {
        "Runs": "200047, 200048",
        "Background": {
            "Runs": "200044",
            "Background": {
                "Runs": "200046"
yuanpeng / powgen_mts.json
{
    "Facility": "SNS",
    "Instrument": "PG3",
    "Title": "pg3_test",
    "Sample": {
        "Runs": "53601",
        "Background": {
            "Runs": "51877",
            "Background": {
                "Runs": "51909"