COMPARA / covid_analysis / Commits / f919e066

Commit f919e066 authored Jun 07, 2024 by Joaquin Torres
Identified problem with features
parent 7b58b74c
Showing 1 changed file with 6 additions and 5 deletions

explicability/shap_vals.py  +6 -5
@@ -143,7 +143,7 @@ if __name__ == "__main__":
     # Shap value generation
     # --------------------------------------------------------------------------------------------------------
     for i, group in enumerate(['pre', 'post']):
-        # Get test dataset based on group
+        # Get test dataset based on group, add column names
         X_test = pd.DataFrame(data_dic['X_test_' + group], columns=attribute_names)
         y_test = data_dic['y_test_' + group]
         for j, method in enumerate(['', '', 'over_', 'under_']):
@@ -157,14 +157,15 @@ if __name__ == "__main__":
             # --------------------------------------------------------------------------------------------------------
             # Fit model with training data
             fitted_model = model.fit(X_train[:500], y_train[:500])
-            # Check if we are dealing with a tree vs nn model
+            # # Check if we are dealing with a tree vs nn model
             if is_tree:
-                explainer = shap.TreeExplainer(fitted_model, X_test[:500])
-            else:
-                explainer = shap.KernelExplainer(fitted_model.predict_proba, X_test[:500])
+                explainer = shap.TreeExplainer(fitted_model)
+            # else:
+            #     explainer = shap.KernelExplainer(fitted_model.predict_proba, X_test[:500])
             # Compute shap values
             shap_vals = explainer.shap_values(X_test[:500], check_additivity=False) # Change to true for final results
             # ---------------------------------------------------------------------------------------------------------
             # Save results
             np.save(f"./output/shap_values/{group}_{method_names[j]}", shap_vals)
+            print(f'Shape of numpy array: {shap_vals.shape}')
             # --------------------------------------------------------------------------------------------------------
\ No newline at end of file
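The added print call reports the shape of the array that np.save has just written. A quick way to cross-check those saved results against the feature list is to reload one of the .npy files, as in the minimal sketch below; the file name pre_orig.npy and the allow_pickle flag are illustrative assumptions, since method_names and the exact output names are not shown in this diff.

# Minimal sketch: reload a saved SHAP array and inspect its dimensions.
# The file name below is hypothetical -- substitute whatever
# f"{group}_{method_names[j]}.npy" the script actually produces.
import numpy as np

arr = np.load("./output/shap_values/pre_orig.npy", allow_pickle=True)
print(arr.shape)
# For a binary classifier, some shap versions return one matrix per class,
# so the loaded array may carry an extra class axis; one of the remaining
# axes should match the number of features (len(attribute_names) above).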