Triangular Learner Model (TLM)
The 1st International Conference of TESOL
& Education (ICTE) and VLTESOL2022
Innovation in E-learning and Emerging Issues in
Teaching Foreign Languages in Post-Covid Era
Date: 22nd January 2022 on Microsoft Teams
Place: Van Lang University, Ho Chi Minh city, Vietnam
Presenter: Dr. Loc Nguyen, PhD
Email: ng_phloc@yahoo.com
Homepage: www.locnguyen.net
22/01/2022 TLM - Core of PhD research 1
Triangular Learner Model
A user model is a description of a user's information and characteristics at an abstract level. The user model is very important to adaptive software, which aims to support the user as much as possible. The process of constructing a user model is called user modeling. Within the learning context, where users are learners, this research proposes the so-called Triangular Learner Model (TLM), which is composed of three essential learner properties: knowledge, learning style, and learning history. TLM is a user model that supports a built-in inference mechanism, so its strong point is the ability to reason out new information about users based on mathematical tools. This paper focuses on the fundamental algorithms and mathematical tools used to construct the three basic components of TLM: the knowledge sub-model, the learning style sub-model, and the learning history sub-model. In general, the paper is a summary of results from research on TLM; algorithms and formulas are described in a succinct way.
22/01/2022 TLM - Core of PhD research 2
Triangular Learner Model
I. Triangular Learner Model (TLM)
II. Zebra: a user modeling system for TLM
III. Knowledge sub-model in TLM
IV. Learning style sub-model in TLM
V. Learning history sub-model in TLM
Original PhD report: Triangular Learner Model
(November 17, 2009)
Advisor: Prof. Dr. Đồng Thị Bích Thủy
Researcher: Nguyễn Phước Lộc
Affiliation: Department of IS, Faculty of IT, University of
Science
22/01/2022 TLM - Core of PhD research 3
I. Triangular Learner Model
Adaptive System
Selection Rules
User Modeling System
User Model
TARGET: Adaptive System
changes its action to provide
learning materials for every
student in accordance with her/his
model
Learning Materials
22/01/2022 TLM - Core of PhD research 4
I. Triangular Learner Model
• Too much information about individuals to
model all users’ characteristics → it is
necessary to choose essential
characteristics from which a stable
architecture of user model is built.
• Some user modeling systems (UMS) lack a
powerful inference mechanism → need a
UMS with solid inference
Hazards of User Modeling
22/01/2022 TLM - Core of PhD research 5
I. Triangular Learner Model (TLM)
Triangular
Learner
Model
(TLM)
• Knowledge (K) sub-model is the combination of overlay model and Bayesian network
• Learning style (LS) sub-model is defined as the composite of characteristic cognitive,
affective and psychological factors
• Learning history (LH) is defined as a transcript of all the learner's actions, such as
accessing learning materials, duration of computer use, doing exercises, taking an
examination, taking a test, communicating with teachers or classmates, etc.
22/01/2022 TLM - Core of PhD research 6
I. Triangular Learner Model
• Knowledge, learning styles and learning history are
prerequisites for modeling learners
• While learning history changes frequently, learning
styles and knowledge are relatively stable.
Combining them ensures the integrity of
information about the learner
• User knowledge is domain-specific information and
learning styles are personal traits. Combining them
enables the user modeling system to take full
advantage of both domain-specific information and
domain-independent information
Why TLM?
22/01/2022 TLM - Core of PhD research 7
I. Triangular Learner Model
Extended Triangular Learner Model
22/01/2022 TLM - Core of PhD research 8
I. Triangular Learner Model
• How to build up TLM?
• How to manipulate (manage) TLM?
• How to infer new information from TLM?
→ Zebra: the user modeling system for TLM
22/01/2022 TLM - Core of PhD research 9
II. Zebra: a user modeling system for TLM
• Mining Engine (ME) manages
learning history sub-model of
TLM
• Belief Network Engine (BNE)
manages knowledge sub-
model and learning style sub-
model of TLM
• Communication Interfaces
(CI) allow users and adaptive
systems to view or modify
TLM in a restricted manner
22/01/2022 TLM - Core of PhD research 10
II. Zebra: a user modeling system for TLM
• Collecting learners’ data, monitoring their
actions, structuring and updating TLM.
• Providing important information to belief
network engine
• Supporting learning concept recommendation
• Discovering some other characteristics
(beyond knowledge and learning styles) such
as interests, goals, etc
• Supporting collaborative learning through
constructing learner groups (communities)
Mining Engine
22/01/2022 TLM - Core of PhD research 11
II. Zebra: a user modeling system for TLM
• Inferring new personal traits from TLM by
using deduction mechanism available in
belief network
• This engine applies Bayesian networks
and hidden Markov models to its
inference mechanism.
• Two sub-models: knowledge & learning
style are managed by this engine
Belief Network Engine
22/01/2022 TLM - Core of PhD research 12
II. Zebra: a user modeling system for TLM
The extended
architecture of
Zebra when
interacting with AES
22/01/2022 TLM - Core of PhD research 13
III. Knowledge sub-model
Knowledge sub-model = overlay model + Bayesian network (BN)
22/01/2022 TLM - Core of PhD research 14
III. Knowledge sub-model
Determining CPT(s) is based on the weights of arcs:

\Pr(X = 1 \mid Y_1, Y_2, \ldots, Y_n) = \sum_{i=1}^{n} w_i \, h(Y_i), where h(Y_i) = 1 if Y_i = 1 and h(Y_i) = 0 otherwise
22/01/2022 TLM - Core of PhD research 15
III. Knowledge sub-model
T1
C  O  I  Pr(J = 1)                      Pr(J = 0) = 1 − Pr(J = 1)
1  1  1  1.0 (0.1*1 + 0.5*1 + 0.4*1)    0.0
1  1  0  0.6 (0.1*1 + 0.5*1 + 0.4*0)    0.4
1  0  1  0.5 (0.1*1 + 0.5*0 + 0.4*1)    0.5
1  0  0  0.1 (0.1*1 + 0.5*0 + 0.4*0)    0.9
0  1  1  0.9 (0.1*0 + 0.5*1 + 0.4*1)    0.1
0  1  0  0.5 (0.1*0 + 0.5*1 + 0.4*0)    0.5
0  0  1  0.4 (0.1*0 + 0.5*0 + 0.4*1)    0.4
0  0  0  0.0 (0.1*0 + 0.5*0 + 0.4*0)    1.0

T2
E  Pr(E = 1)     Pr(E = 0) = 1 − Pr(E = 1)
1  0.8 (0.8*1)   0.2
0  0.0 (0.8*0)   1.0

\Pr(X = 1 \mid Y_1, Y_2, \ldots, Y_n) = \sum_{i=1}^{n} w_i \, h(Y_i), where h(Y_i) = 1 if Y_i = 1 and h(Y_i) = 0 otherwise

T3
Q  Pr(Q = 1)     Pr(Q = 0) = 1 − Pr(Q = 1)
1  0.2 (0.2*1)   0.8
0  0.0 (0.2*0)   1.0
Determining CPT (s) is based on weights of arcs
22/01/2022 TLM - Core of PhD research 16
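To make the weight-based CPT construction above concrete, the following Python sketch reproduces table T1. The parents C, O, I of node J and the arc weights 0.1, 0.5, 0.4 come from the example; the function name and structure are only illustrative.

from itertools import product

def cpt_from_weights(weights):
    # Pr(X = 1 | Y_1, ..., Y_n) = sum_i w_i * h(Y_i), with h(Y_i) = 1 if Y_i = 1, else 0
    table = {}
    for values in product([1, 0], repeat=len(weights)):
        p = sum(w * y for w, y in zip(weights, values))  # for binary parents, h(Y_i) equals Y_i
        table[values] = (p, 1.0 - p)                     # (Pr(X = 1 | ...), Pr(X = 0 | ...))
    return table

# Parents C, O, I of node J with arc weights 0.1, 0.5, 0.4 (table T1)
for parents, (p1, p0) in cpt_from_weights([0.1, 0.5, 0.4]).items():
    print(parents, round(p1, 1), round(p0, 1))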
III. Knowledge sub-model
• Parameter Learning: using the Expectation
Maximization (EM) algorithm or the Maximum
Likelihood Estimation (MLE) algorithm.
Both are applied to beta distributions
• Structure Learning and monitoring:
using Dynamic Bayesian Network (DBN)
Improving knowledge sub-model
22/01/2022 TLM - Core of PhD research 17
III. Knowledge sub-model (EM)
Beta density function
22/01/2022 TLM - Core of PhD research 18
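Because the slide's figure of the beta density is not reproduced in this transcript, here is a minimal Python sketch of the density f(x; a, b) = x^(a-1) (1 - x)^(b-1) / B(a, b) that the EM and MLE procedures estimate; the sample parameters a = 2, b = 5 are assumptions for illustration only.

from scipy.special import beta as beta_fn

def beta_pdf(x, a, b):
    # Beta density f(x; a, b) = x^(a-1) * (1 - x)^(b-1) / B(a, b) on (0, 1)
    return (x ** (a - 1)) * ((1 - x) ** (b - 1)) / beta_fn(a, b)

# Evaluate the density at a few points for the illustrative parameters a = 2, b = 5
for x in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(x, round(beta_pdf(x, 2, 5), 4))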
III. Knowledge sub-model (EM)
EM technique
22/01/2022 TLM - Core of PhD research 19
III. Knowledge sub-model (EM)
EM technique
22/01/2022 TLM - Core of PhD research 20
III. Knowledge sub-model (MLE)
• The essence of maximizing the likelihood
function is to find the peak of the curve of
LnL(θ).
• This can be done by setting the first-order partial
derivative of LnL(θ) with respect to each
parameter θi to 0 and solving this equation to
find parameter θi
  

  










n
i
n
i
n
i
b
i
a
i
n
b
i
a
i
n
i
i x
x
b
a
B
x
x
b
a
B
b
a
x
f
L
1 1 1
1
1
1
1
1
)
1
(
)
,
(
1
)
1
(
)
,
(
1
)
,
,
(
)
(
MLE technique
22/01/2022 TLM - Core of PhD research 21
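As a hedged illustration of the MLE idea on this slide (a numerical stand-in, not the author's exact iterative procedure), the sketch below maximizes the beta log-likelihood LnL(a, b) with scipy; the synthetic data and the starting point (1, 1) are assumptions.

import numpy as np
from scipy.optimize import minimize
from scipy.special import betaln

def neg_log_likelihood(params, x):
    # Negative of LnL(a, b) = -n*ln B(a, b) + (a - 1)*sum(ln x_i) + (b - 1)*sum(ln(1 - x_i))
    a, b = params
    n = len(x)
    lnL = -n * betaln(a, b) + (a - 1) * np.sum(np.log(x)) + (b - 1) * np.sum(np.log(1 - x))
    return -lnL

x = np.random.default_rng(0).beta(2.0, 5.0, size=200)   # synthetic mastery scores in (0, 1)
result = minimize(neg_log_likelihood, x0=[1.0, 1.0], args=(x,),
                  bounds=[(1e-3, None), (1e-3, None)])
print("estimated a, b:", result.x)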
III. Knowledge sub-model (MLE)
Setting the first-order partial derivatives of LnL(a, b) to zero, \frac{\partial \ln L}{\partial a} = 0 and \frac{\partial \ln L}{\partial b} = 0, yields the equations whose solutions are the parameter estimators:

-n\left(\psi(a) - \psi(a + b)\right) + \sum_{i=1}^{n} \ln x_i = 0

-n\left(\psi(b) - \psi(a + b)\right) + \sum_{i=1}^{n} \ln(1 - x_i) = 0

where \psi(\cdot) denotes the digamma function.
22/01/2022 TLM - Core of PhD research 22
III. Knowledge sub-model (MLE)
Iterative Algorithm for MLE
22/01/2022 TLM - Core of PhD research 23
III. Knowledge sub-model (DBN)
• An initial BN G0 = {X[0], Pr(X[0])} at the first time t = 0
• A transition BN is a template consisting of a transition DAG G→
containing the variables in X[t] and X[t+1], and a transition probability
distribution Pr→(X[t+1] | X[t])
A DBN is a BN containing variables that comprise T variable vectors X[t]
22/01/2022 TLM - Core of PhD research 24
III. Knowledge sub-model (DBN)
• DBN can model the temporal relationships among
variables. It can capture the dynamic aspect
• So DBN allows monitoring user’s process of
gaining knowledge and evaluating her/his
knowledge
• The size of the DBN grows very large when the
process continues for a long time
• The number of transition dependencies among
points in time is too large to compute posterior
marginal probabilities
Strong points of DBN
Drawbacks of DBN
22/01/2022 TLM - Core of PhD research 25
III. Knowledge sub-model (DBN)
• To overcome these drawbacks, a new algorithm
is proposed in which both the size of the DBN and
the number of Conditional Probability Tables (CPT)
in the DBN are kept intact when the process
continues for a long time
• To solve the problem of the temporary slip and the
lucky guess: "the learner does (does not) know a
particular subject but there is solid evidence
convincing us that she/he does not (does) understand
it; this evidence just reflects a temporary slip (or lucky guess)".
Purposes of suggested algorithm to improve DBN
22/01/2022 TLM - Core of PhD research 26
III. Knowledge sub-model (DBN)
1. Initializing DBN
2. Specifying transition weights
3. Re-constructing DBN
4. Normalizing weights of dependencies
5. Re-defining CPT (s)
6. Probabilistic inference
The algorithm for DBN includes 6 steps that are
repeated whenever evidence occurs
22/01/2022 TLM - Core of PhD research 27
III. Knowledge sub-model (DBN)
1. Initializing DBN 2. Specifying transition weights 3. Re-constructing
4. Normalizing weights
5. Re-defining CPT(s)
6. Probabilistic inference
22/01/2022 TLM - Core of PhD research 28
IV. Learning style sub-model
• S={s1, s2,…, sn} is the finite set of states
• Ө={θ1, θ2,…, θm} is the set of observations
• A is the transition probability matrix, in which aij is
the probability that the process changes from the
current state si to the next state sj
• B is the observation probability matrix. Let bi(k) be
the probability of observation θk when the second
stochastic process is in state si
• ∏ is the initial state distribution where πi
represents the probability that the stochastic
process begins in state si
A Hidden Markov Model (HMM) is the 5-tuple Δ = <S, Ө, A, B, ∏>
22/01/2022 TLM - Core of PhD research 29
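As a minimal sketch of the 5-tuple Δ = <S, Ө, A, B, ∏>, a weather-style toy instance (in the spirit of the weather forecast example on the next slide) is given below in Python; every number here is an illustrative assumption.

import numpy as np

states = ["sunny", "rainy"]            # S = {s1, s2}
observations = ["dry", "wet"]          # Ө = {θ1, θ2}

A  = np.array([[0.7, 0.3],             # a_ij = Pr(next state s_j | current state s_i)
               [0.4, 0.6]])
B  = np.array([[0.9, 0.1],             # b_i(k) = Pr(observation θ_k | state s_i)
               [0.2, 0.8]])
pi = np.array([0.6, 0.4])              # π_i = Pr(process begins in state s_i)

# Each row of A and B, and the vector ∏, must sum to 1
assert np.allclose(A.sum(axis=1), 1) and np.allclose(B.sum(axis=1), 1) and np.isclose(pi.sum(), 1)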
IV. Learning style sub-model
Weather forecast example
22/01/2022 TLM - Core of PhD research 30
IV. Learning style sub-model
• Given an HMM and a sequence of
observations O = {o1 → o2 →…→ ok}, how do
we find the sequence of states U = {sk → sk+1
→…→ sk+m} so that U is most likely to have
produced the observation sequence O?
• This is the uncovering problem: which
sequence of state transitions is most likely
to have led to this sequence of observations
→ Viterbi algorithm
Uncovering problem
22/01/2022 TLM - Core of PhD research 31
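Below is a compact Python sketch of the Viterbi algorithm referenced on this slide; it takes the arrays A, B, pi of an HMM (as in the sketch after slide 29) and an observation index sequence, and returns the most likely state sequence.

import numpy as np

def viterbi(A, B, pi, obs):
    # Return the most likely state index sequence for the observation index sequence obs
    n_states, T = A.shape[0], len(obs)
    delta = np.zeros((T, n_states))           # best path probability ending in each state
    psi = np.zeros((T, n_states), dtype=int)  # back-pointers to the best previous state
    delta[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        for j in range(n_states):
            scores = delta[t - 1] * A[:, j]
            psi[t, j] = np.argmax(scores)
            delta[t, j] = scores[psi[t, j]] * B[j, obs[t]]
    path = [int(np.argmax(delta[-1]))]        # best final state
    for t in range(T - 1, 0, -1):             # follow the back-pointers
        path.append(int(psi[t, path[-1]]))
    return list(reversed(path))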
• Each learning style is now considered as
a state
• Users’ learning actions are considered
as observations
• After monitoring users’ learning process,
we collect observations about them and
then discover their styles by using
inference mechanism in HMM, namely
Viterbi algorithm
Basic idea
IV. Learning style sub-model
22/01/2022 TLM - Core of PhD research 32
• Suppose we choose the Honey-Mumford model
and the Felder-Silverman model as principal
models, which are represented by HMMs
• We have three dimensions: Verbal/Visual,
Activist/ Reflector, Theorist/ Pragmatist which
are modeled as three HMM(s): ∆1, ∆2, ∆3
respectively
∆1 = 〈 S1, Ө1, A1, B1, ∏ 1〉
∆2= 〈 S2, Ө2, A2, B2, ∏ 2〉.
∆3 = 〈 S3, Ө3, A3, B3, ∏ 3〉.
Basic idea
IV. Learning style sub-model
22/01/2022 TLM - Core of PhD research 33
1. Defining states (S1, S2, S3)
2. Defining initial state distributions
(∏1, ∏2, ∏3)
3. Defining transition probability matrices
(A1, A2, A3)
4. Defining observations (Ө1, Ө2, Ө3)
5. Defining observation probability matrices
(B1, B2, B3)
Technique includes 5 steps
IV. Learning style sub-model
22/01/2022 TLM - Core of PhD research 34
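To connect the five steps above with the Viterbi sketch given after slide 31, here is a toy instance of ∆1 for the Verbal/Visual dimension; the states, observations, and all probabilities are invented for illustration and are not values from the thesis.

import numpy as np

S1 = ["verbal", "visual"]                      # states of ∆1
O1 = ["reads_text", "watches_video"]           # observations of ∆1
A1 = np.array([[0.8, 0.2], [0.3, 0.7]])        # transition probabilities (assumed)
B1 = np.array([[0.9, 0.1], [0.2, 0.8]])        # observation probabilities (assumed)
pi1 = np.array([0.5, 0.5])                     # initial state distribution (assumed)

obs = [0, 0, 1, 0]                             # observed actions: text, text, video, text
states = viterbi(A1, B1, pi1, obs)             # uses the viterbi() sketch given earlier
print([S1[s] for s in states])                 # most likely style sequence, e.g. mostly "verbal"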
IV. Learning style sub-model
22/01/2022 TLM - Core of PhD research 35
Learning
objects
selected
Sequence of state transitions → this student is a verbal, reflective and theoretical person.
Sequence of student observations
An example for inferring student’s learning styles
IV. Learning style sub-model
22/01/2022 TLM - Core of PhD research 36
V. Learning history sub-model
1. Providing necessary information for two
remaining sub-models: learning style sub-
model and knowledge sub-model
2. Supporting learning concept recommendation
3. Mining learners’ educational data in order to
discover other learners’ characteristics such
as interests, background, goals…
4. Supporting collaborative learning through
constructing learner groups.
Learning history managed by Mining Engine has
four responsibilities
22/01/2022 TLM - Core of PhD research 37
• Rule-based filtering: manually or automatically
generated decision rules that are used to
recommend items to users
• Content-based filtering: recommends items
that are considered appropriate to user
information in his profile
• Collaborative filtering: considered as social
filtering when it matches the rating of a current
user for items with those of similar users in
order to produce recommendations for new
items
Recommendation methods
V. Learning history sub-model (recommendation)
22/01/2022 TLM - Core of PhD research 38
• Sequential pattern mining belongs to the
collaborative filtering family
• The user does not rate items explicitly; instead,
the series of items he chooses is recorded as
sequences to construct a sequence database,
which is mined to find frequently repeated
patterns that he may choose in the future
• In learning context, items can be domain
concepts / learning objects which students
access or learn
Sequential pattern
V. Learning history sub-model (recommendation)
22/01/2022 TLM - Core of PhD research 39
• Suppose the concepts in a Java course are: data type, package, class & OOP,
selection structure, virtual machine, loop structure, control structure, and
interface, which are in turn denoted d, p, o, s, v, l, c, f
• At our e-learning website, students access learning material relating to such
concepts in sessions; each session contains only one itemset and sessions are
ordered by time. A student's learning sequence is constituted of the itemsets
accessed in all his sessions
Given problem
V. Learning history sub-model (recommendation)
22/01/2022 TLM - Core of PhD research 40
1. Applying techniques of mining user
learning data to find learning sequential
patterns (not discussed here)
2. Breaking such patterns into concepts
which are recommended to users
Students accessed learning material in their past sessions; how does the
system recommend appropriate concepts to a student for the next
visits? → mining sequential patterns → solution:
V. Learning history sub-model (recommendation)
22/01/2022 TLM - Core of PhD research 41
• Suppose the sequential pattern 〈osc(sc)〉 is
discovered, which means:
"class & OOP" → "selection structure" → "control structure" → "selection
structure, control structure"
• The pattern is considered the learning "route"
that the student preferred or followed often in the past
• The next time a student chooses one concept,
which next concepts should the adaptive
learning system recommend?
→ the patterns should be broken into
association rules with their confidences
Problem
V. Learning history sub-model (recommendation)
22/01/2022 TLM - Core of PhD research 42
1. Breaking entire 〈osc(sc)〉 into litemsets such as o, s, c, (sc)
and determining all possible large 2-sequences whose
order must comply with the order of sequential pattern.
There are six large 2-sequences: 〈os〉, 〈oc〉, 〈o(sc)〉, 〈sc〉,
〈s(sc)〉, 〈c(sc)〉.
2. Thus, we have six rules derived from these large 2-
sequences in form: “left-hand litemset → right-hand
litemset”, for example, rule “s→c” derived from 2-sequence
〈sc〉
3. Computing the confidences of the rules and sorting them, where
confidence(x → y) = support(〈xy〉) / support(〈x〉). Rules
whose confidence is less than the threshold min_conf are
removed
Breaking technique
V. Learning history sub-model (recommendation)
22/01/2022 TLM - Core of PhD research 43
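The breaking technique can be sketched in Python as follows. The pattern 〈osc(sc)〉 comes from the slide, while the support values and the min_conf threshold are made-up numbers used only to show the computation confidence(x → y) = support(〈xy〉) / support(〈x〉).

from itertools import combinations

pattern = [("o",), ("s",), ("c",), ("s", "c")]   # litemsets of the pattern <o s c (sc)>

# Hypothetical supports of 1-sequences and 2-sequences (invented for illustration)
support = {(("o",),): 0.40, (("s",),): 0.50, (("c",),): 0.45, (("s", "c"),): 0.30,
           (("o",), ("s",)): 0.25, (("o",), ("c",)): 0.20, (("o",), ("s", "c")): 0.15,
           (("s",), ("c",)): 0.35, (("s",), ("s", "c")): 0.20, (("c",), ("s", "c")): 0.25}

min_conf = 0.5
rules = []
for left, right in combinations(pattern, 2):     # the six 2-sequences, in the pattern's order
    conf = support[(left, right)] / support[(left,)]
    if conf >= min_conf:                         # rules below min_conf are removed
        rules.append((left, right, conf))

for left, right, conf in sorted(rules, key=lambda r: r[2], reverse=True):
    print(left, "->", right, round(conf, 2))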
• If a student chooses the concept (itemset) x,
the system finds all rules broken from all
sequential patterns whose left-hand litemset
contains x
• Then, these rules are sorted by their
confidences in descending order
• Final outcome is an ordered list of right-hand
litemsets (concepts), which are recommended
to students
Recommended list
if the user chooses the concept "class & OOP"
Recommended List
V. Learning history sub-model (recommendation)
22/01/2022 TLM - Core of PhD research 44
• The series of a user's accesses in his/her
history is modeled as documents, so the
user is referred to indirectly as a
"document".
• User interests are the classes that such
documents belong to
There are two new points of view
V. Learning history sub-model (user interest)
22/01/2022 TLM - Core of PhD research 45
1. Documents in the training corpus are represented according to
the vector model. Each element of a vector is the product of term
frequency and inverse document frequency; however, the
inverse document frequency can be removed from each
element for convenience
2. Classifying the training corpus by applying a decision tree,
support vector machine, or neural network.
3. Mining the user's access history to find maximal frequent
itemsets. Each itemset is considered an interesting
document and its member items are considered terms.
Such interesting documents are modeled as vectors
4. Applying the classifiers (built in step 2) to these interesting
documents (from step 3) in order to choose which classes are most
suitable for them. Such classes are the user's interests
Our approach includes four following steps
V. Learning history sub-model (user interest)
22/01/2022 TLM - Core of PhD research 46
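A hedged scikit-learn sketch of the four-step approach above: a tiny hand-made training corpus stands in for the real one, plain term counts play the role of the term-frequency vectors (IDF dropped, as the slide allows), and a linear SVM classifies an "interesting document" built from a mined itemset of search terms. All documents, labels, and the itemset are assumptions.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import LinearSVC

# Steps 1-2: toy training corpus (invented) represented as term-frequency vectors and classified
corpus = ["computer programming language algorithm",
          "algorithm data structure computer",
          "derivative integral calculus",
          "calculus derivative limit"]
labels = ["computer science", "computer science", "mathematics", "mathematics"]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(corpus)
classifier = LinearSVC().fit(X, labels)

# Steps 3-4: an interesting document built from a maximal frequent itemset of the user's searches
interesting_document = "computer programming language algorithm"
print(classifier.predict(vectorizer.transform([interesting_document])))  # -> ['computer science']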
• Suppose that in some library or website, user U searches
for books and documents of interest to him
• There is a demand for discovering his interests so that such a
library or website can provide adapted documents to him
whenever he visits next
• Given a set of keywords or terms {computer,
programming language, algorithm, derivative} that user U
often looks for, his search history is shown in the
following table:
User searching history
V. Learning history sub-model (user interest)
22/01/2022 TLM - Core of PhD research 47
Using SVM, ANN or a Decision Tree to classify this vector
This vector belongs to the class computer science → the user's interest is computer science
V. Learning history sub-model (user interest)
22/01/2022 TLM - Core of PhD research 48
V. Learning history sub-model (clustering)
• Individual adaptation regards each
individual user
• Community (or group) adaptation
focuses on a community (or group) of
users
There are two kinds of adaptations
22/01/2022 TLM - Core of PhD research 49
V. Learning history sub-model (clustering)
• Common features in a group are
relatively stable, so it is easy for
adaptive systems to perform
adaptive tasks accurately
• If a new user logs into the system,
she/he is classified into a group and
the initial information of her/his model
is set to the common features of that
group
• It is very useful if collaborative
learning is restricted to a group of
similar users
The problem that needs to be solved now is
to cluster user models because a group is a
cluster of similar user models.
22/01/2022 TLM - Core of PhD research 50
V. Learning history sub-model (clustering)
Clustering in case that user model is represented
as a vector: Ui = {ui1, ui2,…, uij,…, uin}
The dissimilarity of two user models is defined as Euclidean distance between them
K-means algorithm

dissim(U_1, U_2) = distance(U_1, U_2) = \sqrt{(u_{11} - u_{21})^2 + (u_{12} - u_{22})^2 + \cdots + (u_{1n} - u_{2n})^2}
22/01/2022 TLM - Core of PhD research 51
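A short Python sketch of the Euclidean dissimilarity above, together with a call to scikit-learn's KMeans to form two learner groups; the three-dimensional user-model vectors are invented for illustration.

import numpy as np
from sklearn.cluster import KMeans

def dissim(u1, u2):
    # Euclidean distance between two user-model vectors, as defined above
    return float(np.sqrt(np.sum((np.asarray(u1) - np.asarray(u2)) ** 2)))

# Invented user-model vectors (e.g., mastery of three domain concepts)
users = np.array([[0.9, 0.8, 0.7], [0.85, 0.9, 0.6], [0.1, 0.2, 0.3], [0.2, 0.1, 0.25]])
print(dissim(users[0], users[2]))
print(KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(users))  # two learner groups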
V. Learning history sub-model (clustering)
Clustering in case that the user model is an overlay model (graph)
The dissimilarity of two graph models:

dissim(G_1, G_2) = distance(G_1, G_2) = \sum_{j=1}^{n} \frac{|v_{1j} - v_{2j}|}{depth(v_j)}
22/01/2022 TLM - Core of PhD research 52
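Following the depth-weighted dissimilarity as reconstructed above (only its general form is recoverable from the slide), a minimal Python sketch: each overlay model maps concept vertices to mastery values, and deeper concepts contribute less to the distance. The concepts, depths, and values are assumptions.

def graph_dissim(g1, g2, depth):
    # Sum of |v1j - v2j| / depth(vj) over the concept vertices, per the formula above
    return sum(abs(g1[v] - g2[v]) / depth[v] for v in g1)

# Invented overlay models over three concepts with their depths in the concept hierarchy
depth = {"java": 1, "class_oop": 2, "interface": 3}
g1 = {"java": 0.9, "class_oop": 0.7, "interface": 0.4}
g2 = {"java": 0.8, "class_oop": 0.3, "interface": 0.1}
print(round(graph_dissim(g1, g2, depth), 3))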
V. Learning history sub-model (clustering)
Clustering in case that user model is a weighted graph
The dissimilarity of two graphs:

dissim(G_1, G_2) = distance(G_1, G_2) = \sum_{j=1}^{n} \frac{|v_{1j} - v_{2j}| \cdot weight(v_j)}{depth(v_j)}
22/01/2022 TLM - Core of PhD research 53
V. Learning history sub-model (clustering)
Clustering in case that the user model is a Bayesian network
The dissimilarity of two graph models:

dissim(G_1, G_2) = distance(G_1, G_2) = \sum_{j=1}^{n} \frac{|\Pr(v_{1j}) - \Pr(v_{2j})|}{depth(v_j)}
22/01/2022 TLM - Core of PhD research 54
V. Learning history sub-model (clustering)
• Cosine similarity measure:

sim(U_i, U_j) = \cos(U_i, U_j) = \frac{U_i \cdot U_j}{|U_i| \, |U_j|} = \frac{\sum_{k=1}^{n} u_{ik} u_{jk}}{\sqrt{\sum_{k=1}^{n} u_{ik}^2} \sqrt{\sum_{k=1}^{n} u_{jk}^2}}

• Correlation coefficient:

sim(U_i, U_j) = correl(U_i, U_j) = \frac{\sum_{k=1}^{n} (u_{ik} - \bar{U}_i)(u_{jk} - \bar{U}_j)}{\sqrt{\sum_{k=1}^{n} (u_{ik} - \bar{U}_i)^2} \sqrt{\sum_{k=1}^{n} (u_{jk} - \bar{U}_j)^2}}
K-medoids algorithm and two similarity measures
22/01/2022 TLM - Core of PhD research 55
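The two similarity measures above in a minimal numpy sketch; the example vectors are illustrative only.

import numpy as np

def cosine_sim(ui, uj):
    # Cosine similarity: (Ui . Uj) / (|Ui| * |Uj|)
    ui, uj = np.asarray(ui, float), np.asarray(uj, float)
    return float(np.dot(ui, uj) / (np.linalg.norm(ui) * np.linalg.norm(uj)))

def correlation_sim(ui, uj):
    # Pearson correlation coefficient of the two user-model vectors (centered cosine)
    ui, uj = np.asarray(ui, float), np.asarray(uj, float)
    ui, uj = ui - ui.mean(), uj - uj.mean()
    return float(np.dot(ui, uj) / (np.linalg.norm(ui) * np.linalg.norm(uj)))

u1, u2 = [0.9, 0.8, 0.7], [0.8, 0.9, 0.6]
print(cosine_sim(u1, u2), correlation_sim(u1, u2))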
THANK YOU FOR YOUR CONSIDERATION
22/01/2022 TLM - Core of PhD research 56

Más contenido relacionado

La actualidad más candente

5212303961620480 1585670953 joanna_stachera_proposal_g_soc2020
5212303961620480 1585670953 joanna_stachera_proposal_g_soc20205212303961620480 1585670953 joanna_stachera_proposal_g_soc2020
5212303961620480 1585670953 joanna_stachera_proposal_g_soc2020JoannaStachera1
 
Adaptive network based fuzzy
Adaptive network based fuzzyAdaptive network based fuzzy
Adaptive network based fuzzyijaia
 
Master defence 2020 - Oleh Misko - Ensembling and Transfer Learning for Multi...
Master defence 2020 - Oleh Misko - Ensembling and Transfer Learning for Multi...Master defence 2020 - Oleh Misko - Ensembling and Transfer Learning for Multi...
Master defence 2020 - Oleh Misko - Ensembling and Transfer Learning for Multi...Lviv Data Science Summer School
 
A Time Series ANN Approach for Weather Forecasting
A Time Series ANN Approach for Weather ForecastingA Time Series ANN Approach for Weather Forecasting
A Time Series ANN Approach for Weather Forecastingijctcm
 
Unsupervised Clustering Classify theCancer Data withtheHelp of FCM Algorithm
Unsupervised Clustering Classify theCancer Data withtheHelp of FCM AlgorithmUnsupervised Clustering Classify theCancer Data withtheHelp of FCM Algorithm
Unsupervised Clustering Classify theCancer Data withtheHelp of FCM AlgorithmIOSR Journals
 
An ann approach for network
An ann approach for networkAn ann approach for network
An ann approach for networkIJNSA Journal
 
A Novel GA-SVM Model For Vehicles And Pedestrial Classification In Videos
A Novel GA-SVM Model For Vehicles And Pedestrial Classification In VideosA Novel GA-SVM Model For Vehicles And Pedestrial Classification In Videos
A Novel GA-SVM Model For Vehicles And Pedestrial Classification In Videosijtsrd
 
Pattern recognition using context dependent memory model (cdmm) in multimodal...
Pattern recognition using context dependent memory model (cdmm) in multimodal...Pattern recognition using context dependent memory model (cdmm) in multimodal...
Pattern recognition using context dependent memory model (cdmm) in multimodal...ijfcstjournal
 
Architecture neural network deep optimizing based on self organizing feature ...
Architecture neural network deep optimizing based on self organizing feature ...Architecture neural network deep optimizing based on self organizing feature ...
Architecture neural network deep optimizing based on self organizing feature ...journalBEEI
 
IRJET - Clustering Algorithm for Brain Image Segmentation
IRJET - Clustering Algorithm for Brain Image SegmentationIRJET - Clustering Algorithm for Brain Image Segmentation
IRJET - Clustering Algorithm for Brain Image SegmentationIRJET Journal
 
ROBUST TEXT DETECTION AND EXTRACTION IN NATURAL SCENE IMAGES USING CONDITIONA...
ROBUST TEXT DETECTION AND EXTRACTION IN NATURAL SCENE IMAGES USING CONDITIONA...ROBUST TEXT DETECTION AND EXTRACTION IN NATURAL SCENE IMAGES USING CONDITIONA...
ROBUST TEXT DETECTION AND EXTRACTION IN NATURAL SCENE IMAGES USING CONDITIONA...ijiert bestjournal
 
A Review of Image Classification Techniques
A Review of Image Classification TechniquesA Review of Image Classification Techniques
A Review of Image Classification TechniquesIRJET Journal
 

La actualidad más candente (18)

Topology Note
Topology NoteTopology Note
Topology Note
 
5212303961620480 1585670953 joanna_stachera_proposal_g_soc2020
5212303961620480 1585670953 joanna_stachera_proposal_g_soc20205212303961620480 1585670953 joanna_stachera_proposal_g_soc2020
5212303961620480 1585670953 joanna_stachera_proposal_g_soc2020
 
Adaptive network based fuzzy
Adaptive network based fuzzyAdaptive network based fuzzy
Adaptive network based fuzzy
 
Topology
TopologyTopology
Topology
 
Master defence 2020 - Oleh Misko - Ensembling and Transfer Learning for Multi...
Master defence 2020 - Oleh Misko - Ensembling and Transfer Learning for Multi...Master defence 2020 - Oleh Misko - Ensembling and Transfer Learning for Multi...
Master defence 2020 - Oleh Misko - Ensembling and Transfer Learning for Multi...
 
A Time Series ANN Approach for Weather Forecasting
A Time Series ANN Approach for Weather ForecastingA Time Series ANN Approach for Weather Forecasting
A Time Series ANN Approach for Weather Forecasting
 
Unsupervised Clustering Classify theCancer Data withtheHelp of FCM Algorithm
Unsupervised Clustering Classify theCancer Data withtheHelp of FCM AlgorithmUnsupervised Clustering Classify theCancer Data withtheHelp of FCM Algorithm
Unsupervised Clustering Classify theCancer Data withtheHelp of FCM Algorithm
 
Topology
TopologyTopology
Topology
 
An ann approach for network
An ann approach for networkAn ann approach for network
An ann approach for network
 
A Novel GA-SVM Model For Vehicles And Pedestrial Classification In Videos
A Novel GA-SVM Model For Vehicles And Pedestrial Classification In VideosA Novel GA-SVM Model For Vehicles And Pedestrial Classification In Videos
A Novel GA-SVM Model For Vehicles And Pedestrial Classification In Videos
 
Pattern recognition using context dependent memory model (cdmm) in multimodal...
Pattern recognition using context dependent memory model (cdmm) in multimodal...Pattern recognition using context dependent memory model (cdmm) in multimodal...
Pattern recognition using context dependent memory model (cdmm) in multimodal...
 
class 11
class 11class 11
class 11
 
Sunbelt 2013 Presentation
Sunbelt 2013 PresentationSunbelt 2013 Presentation
Sunbelt 2013 Presentation
 
Architecture neural network deep optimizing based on self organizing feature ...
Architecture neural network deep optimizing based on self organizing feature ...Architecture neural network deep optimizing based on self organizing feature ...
Architecture neural network deep optimizing based on self organizing feature ...
 
Network topology
Network topologyNetwork topology
Network topology
 
IRJET - Clustering Algorithm for Brain Image Segmentation
IRJET - Clustering Algorithm for Brain Image SegmentationIRJET - Clustering Algorithm for Brain Image Segmentation
IRJET - Clustering Algorithm for Brain Image Segmentation
 
ROBUST TEXT DETECTION AND EXTRACTION IN NATURAL SCENE IMAGES USING CONDITIONA...
ROBUST TEXT DETECTION AND EXTRACTION IN NATURAL SCENE IMAGES USING CONDITIONA...ROBUST TEXT DETECTION AND EXTRACTION IN NATURAL SCENE IMAGES USING CONDITIONA...
ROBUST TEXT DETECTION AND EXTRACTION IN NATURAL SCENE IMAGES USING CONDITIONA...
 
A Review of Image Classification Techniques
A Review of Image Classification TechniquesA Review of Image Classification Techniques
A Review of Image Classification Techniques
 

Similar a Triangular Learner Model

User modeling system demo at ICL December 06 2014
User modeling system demo at ICL December 06 2014User modeling system demo at ICL December 06 2014
User modeling system demo at ICL December 06 2014Loc Nguyen
 
Ensemble Methods for Collective Intelligence: Combining Ubiquitous ML Models ...
Ensemble Methods for Collective Intelligence: Combining Ubiquitous ML Models ...Ensemble Methods for Collective Intelligence: Combining Ubiquitous ML Models ...
Ensemble Methods for Collective Intelligence: Combining Ubiquitous ML Models ...Bharath Sudharsan
 
IDEE Workshop: Applying the 4C-ID Model to the Design of a Digital Educationa...
IDEE Workshop: Applying the 4C-ID Model to the Design of a Digital Educationa...IDEE Workshop: Applying the 4C-ID Model to the Design of a Digital Educationa...
IDEE Workshop: Applying the 4C-ID Model to the Design of a Digital Educationa...Guilhermina Miranda
 
Machine learning with Big Data power point presentation
Machine learning with Big Data power point presentationMachine learning with Big Data power point presentation
Machine learning with Big Data power point presentationDavid Raj Kanthi
 
NS-CUK Seminar: J.H.Lee, Review on "Task Relation-aware Continual User Repres...
NS-CUK Seminar: J.H.Lee, Review on "Task Relation-aware Continual User Repres...NS-CUK Seminar: J.H.Lee, Review on "Task Relation-aware Continual User Repres...
NS-CUK Seminar: J.H.Lee, Review on "Task Relation-aware Continual User Repres...ssuser4b1f48
 
Current clustering techniques
Current clustering techniquesCurrent clustering techniques
Current clustering techniquesPoonam Kshirsagar
 
Machine Learning (ML) in Wireless Sensor Networks (WSNs)
Machine Learning (ML) in Wireless Sensor Networks (WSNs)Machine Learning (ML) in Wireless Sensor Networks (WSNs)
Machine Learning (ML) in Wireless Sensor Networks (WSNs)mabualsh
 
How can pre-training help to solve the cold start problem?
How can pre-training help to solve the cold start problem?How can pre-training help to solve the cold start problem?
How can pre-training help to solve the cold start problem?Lokesh Vadlamudi
 
VII Jornadas eMadrid "Education in exponential times". Mesa redonda eMadrid L...
VII Jornadas eMadrid "Education in exponential times". Mesa redonda eMadrid L...VII Jornadas eMadrid "Education in exponential times". Mesa redonda eMadrid L...
VII Jornadas eMadrid "Education in exponential times". Mesa redonda eMadrid L...eMadrid network
 
A SIMPLE PROCESS TO SPEED UP MACHINE LEARNING METHODS: APPLICATION TO HIDDEN ...
A SIMPLE PROCESS TO SPEED UP MACHINE LEARNING METHODS: APPLICATION TO HIDDEN ...A SIMPLE PROCESS TO SPEED UP MACHINE LEARNING METHODS: APPLICATION TO HIDDEN ...
A SIMPLE PROCESS TO SPEED UP MACHINE LEARNING METHODS: APPLICATION TO HIDDEN ...cscpconf
 
Final Year Project Report Example
Final Year Project Report ExampleFinal Year Project Report Example
Final Year Project Report ExampleMuhd Mu'izuddin
 
Modeling the Student Success or Failure in Engineering at VUT Using the Date ...
Modeling the Student Success or Failure in Engineering at VUT Using the Date ...Modeling the Student Success or Failure in Engineering at VUT Using the Date ...
Modeling the Student Success or Failure in Engineering at VUT Using the Date ...journal ijrtem
 
Modeling the Student Success or Failure in Engineering at VUT Using the Date ...
Modeling the Student Success or Failure in Engineering at VUT Using the Date ...Modeling the Student Success or Failure in Engineering at VUT Using the Date ...
Modeling the Student Success or Failure in Engineering at VUT Using the Date ...IJRTEMJOURNAL
 
[Cryptica 22] Deep Learning on Tabular Data, Predicting Profitability - Peiyu...
[Cryptica 22] Deep Learning on Tabular Data, Predicting Profitability - Peiyu...[Cryptica 22] Deep Learning on Tabular Data, Predicting Profitability - Peiyu...
[Cryptica 22] Deep Learning on Tabular Data, Predicting Profitability - Peiyu...DataScienceConferenc1
 
Model Evaluation in the land of Deep Learning
Model Evaluation in the land of Deep LearningModel Evaluation in the land of Deep Learning
Model Evaluation in the land of Deep LearningPramit Choudhary
 
Interactive Machine Learning
Interactive  Machine LearningInteractive  Machine Learning
Interactive Machine LearningZitao Liu
 
Extended pso algorithm for improvement problems k means clustering algorithm
Extended pso algorithm for improvement problems k means clustering algorithmExtended pso algorithm for improvement problems k means clustering algorithm
Extended pso algorithm for improvement problems k means clustering algorithmIJMIT JOURNAL
 
IntroML_1_Introduction_Tagged.pdf
IntroML_1_Introduction_Tagged.pdfIntroML_1_Introduction_Tagged.pdf
IntroML_1_Introduction_Tagged.pdfElio Laureano
 

Similar a Triangular Learner Model (20)

User modeling system demo at ICL December 06 2014
User modeling system demo at ICL December 06 2014User modeling system demo at ICL December 06 2014
User modeling system demo at ICL December 06 2014
 
Ensemble Methods for Collective Intelligence: Combining Ubiquitous ML Models ...
Ensemble Methods for Collective Intelligence: Combining Ubiquitous ML Models ...Ensemble Methods for Collective Intelligence: Combining Ubiquitous ML Models ...
Ensemble Methods for Collective Intelligence: Combining Ubiquitous ML Models ...
 
IDEE Workshop: Applying the 4C-ID Model to the Design of a Digital Educationa...
IDEE Workshop: Applying the 4C-ID Model to the Design of a Digital Educationa...IDEE Workshop: Applying the 4C-ID Model to the Design of a Digital Educationa...
IDEE Workshop: Applying the 4C-ID Model to the Design of a Digital Educationa...
 
Machine learning with Big Data power point presentation
Machine learning with Big Data power point presentationMachine learning with Big Data power point presentation
Machine learning with Big Data power point presentation
 
NS-CUK Seminar: J.H.Lee, Review on "Task Relation-aware Continual User Repres...
NS-CUK Seminar: J.H.Lee, Review on "Task Relation-aware Continual User Repres...NS-CUK Seminar: J.H.Lee, Review on "Task Relation-aware Continual User Repres...
NS-CUK Seminar: J.H.Lee, Review on "Task Relation-aware Continual User Repres...
 
Current clustering techniques
Current clustering techniquesCurrent clustering techniques
Current clustering techniques
 
Machine Learning (ML) in Wireless Sensor Networks (WSNs)
Machine Learning (ML) in Wireless Sensor Networks (WSNs)Machine Learning (ML) in Wireless Sensor Networks (WSNs)
Machine Learning (ML) in Wireless Sensor Networks (WSNs)
 
How can pre-training help to solve the cold start problem?
How can pre-training help to solve the cold start problem?How can pre-training help to solve the cold start problem?
How can pre-training help to solve the cold start problem?
 
VII Jornadas eMadrid "Education in exponential times". Mesa redonda eMadrid L...
VII Jornadas eMadrid "Education in exponential times". Mesa redonda eMadrid L...VII Jornadas eMadrid "Education in exponential times". Mesa redonda eMadrid L...
VII Jornadas eMadrid "Education in exponential times". Mesa redonda eMadrid L...
 
A SIMPLE PROCESS TO SPEED UP MACHINE LEARNING METHODS: APPLICATION TO HIDDEN ...
A SIMPLE PROCESS TO SPEED UP MACHINE LEARNING METHODS: APPLICATION TO HIDDEN ...A SIMPLE PROCESS TO SPEED UP MACHINE LEARNING METHODS: APPLICATION TO HIDDEN ...
A SIMPLE PROCESS TO SPEED UP MACHINE LEARNING METHODS: APPLICATION TO HIDDEN ...
 
Final Year Project Report Example
Final Year Project Report ExampleFinal Year Project Report Example
Final Year Project Report Example
 
Unit 5
Unit 5Unit 5
Unit 5
 
Modeling the Student Success or Failure in Engineering at VUT Using the Date ...
Modeling the Student Success or Failure in Engineering at VUT Using the Date ...Modeling the Student Success or Failure in Engineering at VUT Using the Date ...
Modeling the Student Success or Failure in Engineering at VUT Using the Date ...
 
Modeling the Student Success or Failure in Engineering at VUT Using the Date ...
Modeling the Student Success or Failure in Engineering at VUT Using the Date ...Modeling the Student Success or Failure in Engineering at VUT Using the Date ...
Modeling the Student Success or Failure in Engineering at VUT Using the Date ...
 
[Cryptica 22] Deep Learning on Tabular Data, Predicting Profitability - Peiyu...
[Cryptica 22] Deep Learning on Tabular Data, Predicting Profitability - Peiyu...[Cryptica 22] Deep Learning on Tabular Data, Predicting Profitability - Peiyu...
[Cryptica 22] Deep Learning on Tabular Data, Predicting Profitability - Peiyu...
 
06522405
0652240506522405
06522405
 
Model Evaluation in the land of Deep Learning
Model Evaluation in the land of Deep LearningModel Evaluation in the land of Deep Learning
Model Evaluation in the land of Deep Learning
 
Interactive Machine Learning
Interactive  Machine LearningInteractive  Machine Learning
Interactive Machine Learning
 
Extended pso algorithm for improvement problems k means clustering algorithm
Extended pso algorithm for improvement problems k means clustering algorithmExtended pso algorithm for improvement problems k means clustering algorithm
Extended pso algorithm for improvement problems k means clustering algorithm
 
IntroML_1_Introduction_Tagged.pdf
IntroML_1_Introduction_Tagged.pdfIntroML_1_Introduction_Tagged.pdf
IntroML_1_Introduction_Tagged.pdf
 

Más de Loc Nguyen

Conditional mixture model and its application for regression model
Conditional mixture model and its application for regression modelConditional mixture model and its application for regression model
Conditional mixture model and its application for regression modelLoc Nguyen
 
Nghịch dân chủ luận (tổng quan về dân chủ và thể chế chính trị liên quan đến ...
Nghịch dân chủ luận (tổng quan về dân chủ và thể chế chính trị liên quan đến ...Nghịch dân chủ luận (tổng quan về dân chủ và thể chế chính trị liên quan đến ...
Nghịch dân chủ luận (tổng quan về dân chủ và thể chế chính trị liên quan đến ...Loc Nguyen
 
A Novel Collaborative Filtering Algorithm by Bit Mining Frequent Itemsets
A Novel Collaborative Filtering Algorithm by Bit Mining Frequent ItemsetsA Novel Collaborative Filtering Algorithm by Bit Mining Frequent Itemsets
A Novel Collaborative Filtering Algorithm by Bit Mining Frequent ItemsetsLoc Nguyen
 
Simple image deconvolution based on reverse image convolution and backpropaga...
Simple image deconvolution based on reverse image convolution and backpropaga...Simple image deconvolution based on reverse image convolution and backpropaga...
Simple image deconvolution based on reverse image convolution and backpropaga...Loc Nguyen
 
Technological Accessibility: Learning Platform Among Senior High School Students
Technological Accessibility: Learning Platform Among Senior High School StudentsTechnological Accessibility: Learning Platform Among Senior High School Students
Technological Accessibility: Learning Platform Among Senior High School StudentsLoc Nguyen
 
Engineering for Social Impact
Engineering for Social ImpactEngineering for Social Impact
Engineering for Social ImpactLoc Nguyen
 
Harnessing Technology for Research Education
Harnessing Technology for Research EducationHarnessing Technology for Research Education
Harnessing Technology for Research EducationLoc Nguyen
 
Future of education with support of technology
Future of education with support of technologyFuture of education with support of technology
Future of education with support of technologyLoc Nguyen
 
Where the dragon to fly
Where the dragon to flyWhere the dragon to fly
Where the dragon to flyLoc Nguyen
 
Adversarial Variational Autoencoders to extend and improve generative model
Adversarial Variational Autoencoders to extend and improve generative modelAdversarial Variational Autoencoders to extend and improve generative model
Adversarial Variational Autoencoders to extend and improve generative modelLoc Nguyen
 
Learning dyadic data and predicting unaccomplished co-occurrent values by mix...
Learning dyadic data and predicting unaccomplished co-occurrent values by mix...Learning dyadic data and predicting unaccomplished co-occurrent values by mix...
Learning dyadic data and predicting unaccomplished co-occurrent values by mix...Loc Nguyen
 
Tutorial on Bayesian optimization
Tutorial on Bayesian optimizationTutorial on Bayesian optimization
Tutorial on Bayesian optimizationLoc Nguyen
 
Tutorial on Support Vector Machine
Tutorial on Support Vector MachineTutorial on Support Vector Machine
Tutorial on Support Vector MachineLoc Nguyen
 
A Proposal of Two-step Autoregressive Model
A Proposal of Two-step Autoregressive ModelA Proposal of Two-step Autoregressive Model
A Proposal of Two-step Autoregressive ModelLoc Nguyen
 
Extreme bound analysis based on correlation coefficient for optimal regressio...
Extreme bound analysis based on correlation coefficient for optimal regressio...Extreme bound analysis based on correlation coefficient for optimal regressio...
Extreme bound analysis based on correlation coefficient for optimal regressio...Loc Nguyen
 
Jagged stock investment strategy
Jagged stock investment strategyJagged stock investment strategy
Jagged stock investment strategyLoc Nguyen
 
A short study on minima distribution
A short study on minima distributionA short study on minima distribution
A short study on minima distributionLoc Nguyen
 
Tutorial on EM algorithm – Part 4
Tutorial on EM algorithm – Part 4Tutorial on EM algorithm – Part 4
Tutorial on EM algorithm – Part 4Loc Nguyen
 
Tutorial on EM algorithm – Part 3
Tutorial on EM algorithm – Part 3Tutorial on EM algorithm – Part 3
Tutorial on EM algorithm – Part 3Loc Nguyen
 
Tutorial on particle swarm optimization
Tutorial on particle swarm optimizationTutorial on particle swarm optimization
Tutorial on particle swarm optimizationLoc Nguyen
 

Más de Loc Nguyen (20)

Conditional mixture model and its application for regression model
Conditional mixture model and its application for regression modelConditional mixture model and its application for regression model
Conditional mixture model and its application for regression model
 
Nghịch dân chủ luận (tổng quan về dân chủ và thể chế chính trị liên quan đến ...
Nghịch dân chủ luận (tổng quan về dân chủ và thể chế chính trị liên quan đến ...Nghịch dân chủ luận (tổng quan về dân chủ và thể chế chính trị liên quan đến ...
Nghịch dân chủ luận (tổng quan về dân chủ và thể chế chính trị liên quan đến ...
 
A Novel Collaborative Filtering Algorithm by Bit Mining Frequent Itemsets
A Novel Collaborative Filtering Algorithm by Bit Mining Frequent ItemsetsA Novel Collaborative Filtering Algorithm by Bit Mining Frequent Itemsets
A Novel Collaborative Filtering Algorithm by Bit Mining Frequent Itemsets
 
Simple image deconvolution based on reverse image convolution and backpropaga...
Simple image deconvolution based on reverse image convolution and backpropaga...Simple image deconvolution based on reverse image convolution and backpropaga...
Simple image deconvolution based on reverse image convolution and backpropaga...
 
Technological Accessibility: Learning Platform Among Senior High School Students
Technological Accessibility: Learning Platform Among Senior High School StudentsTechnological Accessibility: Learning Platform Among Senior High School Students
Technological Accessibility: Learning Platform Among Senior High School Students
 
Engineering for Social Impact
Engineering for Social ImpactEngineering for Social Impact
Engineering for Social Impact
 
Harnessing Technology for Research Education
Harnessing Technology for Research EducationHarnessing Technology for Research Education
Harnessing Technology for Research Education
 
Future of education with support of technology
Future of education with support of technologyFuture of education with support of technology
Future of education with support of technology
 
Where the dragon to fly
Where the dragon to flyWhere the dragon to fly
Where the dragon to fly
 
Adversarial Variational Autoencoders to extend and improve generative model
Adversarial Variational Autoencoders to extend and improve generative modelAdversarial Variational Autoencoders to extend and improve generative model
Adversarial Variational Autoencoders to extend and improve generative model
 
Learning dyadic data and predicting unaccomplished co-occurrent values by mix...
Learning dyadic data and predicting unaccomplished co-occurrent values by mix...Learning dyadic data and predicting unaccomplished co-occurrent values by mix...
Learning dyadic data and predicting unaccomplished co-occurrent values by mix...
 
Tutorial on Bayesian optimization
Tutorial on Bayesian optimizationTutorial on Bayesian optimization
Tutorial on Bayesian optimization
 
Tutorial on Support Vector Machine
Tutorial on Support Vector MachineTutorial on Support Vector Machine
Tutorial on Support Vector Machine
 
A Proposal of Two-step Autoregressive Model
A Proposal of Two-step Autoregressive ModelA Proposal of Two-step Autoregressive Model
A Proposal of Two-step Autoregressive Model
 
Extreme bound analysis based on correlation coefficient for optimal regressio...
Extreme bound analysis based on correlation coefficient for optimal regressio...Extreme bound analysis based on correlation coefficient for optimal regressio...
Extreme bound analysis based on correlation coefficient for optimal regressio...
 
Jagged stock investment strategy
Jagged stock investment strategyJagged stock investment strategy
Jagged stock investment strategy
 
A short study on minima distribution
A short study on minima distributionA short study on minima distribution
A short study on minima distribution
 
Tutorial on EM algorithm – Part 4
Tutorial on EM algorithm – Part 4Tutorial on EM algorithm – Part 4
Tutorial on EM algorithm – Part 4
 
Tutorial on EM algorithm – Part 3
Tutorial on EM algorithm – Part 3Tutorial on EM algorithm – Part 3
Tutorial on EM algorithm – Part 3
 
Tutorial on particle swarm optimization
Tutorial on particle swarm optimizationTutorial on particle swarm optimization
Tutorial on particle swarm optimization
 

Último

ICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptxICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptxAreebaZafar22
 
On_Translating_a_Tamil_Poem_by_A_K_Ramanujan.pptx
On_Translating_a_Tamil_Poem_by_A_K_Ramanujan.pptxOn_Translating_a_Tamil_Poem_by_A_K_Ramanujan.pptx
On_Translating_a_Tamil_Poem_by_A_K_Ramanujan.pptxPooja Bhuva
 
Accessible Digital Futures project (20/03/2024)
Accessible Digital Futures project (20/03/2024)Accessible Digital Futures project (20/03/2024)
Accessible Digital Futures project (20/03/2024)Jisc
 
Google Gemini An AI Revolution in Education.pptx
Google Gemini An AI Revolution in Education.pptxGoogle Gemini An AI Revolution in Education.pptx
Google Gemini An AI Revolution in Education.pptxDr. Sarita Anand
 
Jamworks pilot and AI at Jisc (20/03/2024)
Jamworks pilot and AI at Jisc (20/03/2024)Jamworks pilot and AI at Jisc (20/03/2024)
Jamworks pilot and AI at Jisc (20/03/2024)Jisc
 
Micro-Scholarship, What it is, How can it help me.pdf
Micro-Scholarship, What it is, How can it help me.pdfMicro-Scholarship, What it is, How can it help me.pdf
Micro-Scholarship, What it is, How can it help me.pdfPoh-Sun Goh
 
80 ĐỀ THI THỬ TUYỂN SINH TIẾNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...
80 ĐỀ THI THỬ TUYỂN SINH TIẾNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...80 ĐỀ THI THỬ TUYỂN SINH TIẾNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...
80 ĐỀ THI THỬ TUYỂN SINH TIẾNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...Nguyen Thanh Tu Collection
 
Plant propagation: Sexual and Asexual propapagation.pptx
Plant propagation: Sexual and Asexual propapagation.pptxPlant propagation: Sexual and Asexual propapagation.pptx
Plant propagation: Sexual and Asexual propapagation.pptxUmeshTimilsina1
 
Beyond_Borders_Understanding_Anime_and_Manga_Fandom_A_Comprehensive_Audience_...
Beyond_Borders_Understanding_Anime_and_Manga_Fandom_A_Comprehensive_Audience_...Beyond_Borders_Understanding_Anime_and_Manga_Fandom_A_Comprehensive_Audience_...
Beyond_Borders_Understanding_Anime_and_Manga_Fandom_A_Comprehensive_Audience_...Pooja Bhuva
 
Exploring_the_Narrative_Style_of_Amitav_Ghoshs_Gun_Island.pptx
Exploring_the_Narrative_Style_of_Amitav_Ghoshs_Gun_Island.pptxExploring_the_Narrative_Style_of_Amitav_Ghoshs_Gun_Island.pptx
Exploring_the_Narrative_Style_of_Amitav_Ghoshs_Gun_Island.pptxPooja Bhuva
 
Salient Features of India constitution especially power and functions
Salient Features of India constitution especially power and functionsSalient Features of India constitution especially power and functions
Salient Features of India constitution especially power and functionsKarakKing
 
Unit 3 Emotional Intelligence and Spiritual Intelligence.pdf
Unit 3 Emotional Intelligence and Spiritual Intelligence.pdfUnit 3 Emotional Intelligence and Spiritual Intelligence.pdf
Unit 3 Emotional Intelligence and Spiritual Intelligence.pdfDr Vijay Vishwakarma
 
Sensory_Experience_and_Emotional_Resonance_in_Gabriel_Okaras_The_Piano_and_Th...
Sensory_Experience_and_Emotional_Resonance_in_Gabriel_Okaras_The_Piano_and_Th...Sensory_Experience_and_Emotional_Resonance_in_Gabriel_Okaras_The_Piano_and_Th...
Sensory_Experience_and_Emotional_Resonance_in_Gabriel_Okaras_The_Piano_and_Th...Pooja Bhuva
 
Sociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning ExhibitSociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning Exhibitjbellavia9
 
How to Add New Custom Addons Path in Odoo 17
How to Add New Custom Addons Path in Odoo 17How to Add New Custom Addons Path in Odoo 17
How to Add New Custom Addons Path in Odoo 17Celine George
 
Fostering Friendships - Enhancing Social Bonds in the Classroom
Fostering Friendships - Enhancing Social Bonds  in the ClassroomFostering Friendships - Enhancing Social Bonds  in the Classroom
Fostering Friendships - Enhancing Social Bonds in the ClassroomPooky Knightsmith
 
UGC NET Paper 1 Mathematical Reasoning & Aptitude.pdf
UGC NET Paper 1 Mathematical Reasoning & Aptitude.pdfUGC NET Paper 1 Mathematical Reasoning & Aptitude.pdf
UGC NET Paper 1 Mathematical Reasoning & Aptitude.pdfNirmal Dwivedi
 
Python Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docxPython Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docxRamakrishna Reddy Bijjam
 
How to Manage Global Discount in Odoo 17 POS
How to Manage Global Discount in Odoo 17 POSHow to Manage Global Discount in Odoo 17 POS
How to Manage Global Discount in Odoo 17 POSCeline George
 
COMMUNICATING NEGATIVE NEWS - APPROACHES .pptx
COMMUNICATING NEGATIVE NEWS - APPROACHES .pptxCOMMUNICATING NEGATIVE NEWS - APPROACHES .pptx
COMMUNICATING NEGATIVE NEWS - APPROACHES .pptxannathomasp01
 

Último (20)

ICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptxICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptx
 
On_Translating_a_Tamil_Poem_by_A_K_Ramanujan.pptx
On_Translating_a_Tamil_Poem_by_A_K_Ramanujan.pptxOn_Translating_a_Tamil_Poem_by_A_K_Ramanujan.pptx
On_Translating_a_Tamil_Poem_by_A_K_Ramanujan.pptx
 
Accessible Digital Futures project (20/03/2024)
Accessible Digital Futures project (20/03/2024)Accessible Digital Futures project (20/03/2024)
Accessible Digital Futures project (20/03/2024)
 
Google Gemini An AI Revolution in Education.pptx
Google Gemini An AI Revolution in Education.pptxGoogle Gemini An AI Revolution in Education.pptx
Google Gemini An AI Revolution in Education.pptx
 
Jamworks pilot and AI at Jisc (20/03/2024)
Jamworks pilot and AI at Jisc (20/03/2024)Jamworks pilot and AI at Jisc (20/03/2024)
Jamworks pilot and AI at Jisc (20/03/2024)
 
Micro-Scholarship, What it is, How can it help me.pdf
Micro-Scholarship, What it is, How can it help me.pdfMicro-Scholarship, What it is, How can it help me.pdf
Micro-Scholarship, What it is, How can it help me.pdf
 
80 ĐỀ THI THỬ TUYỂN SINH TIẾNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...
80 ĐỀ THI THỬ TUYỂN SINH TIẾNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...80 ĐỀ THI THỬ TUYỂN SINH TIẾNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...
80 ĐỀ THI THỬ TUYỂN SINH TIẾNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...
 
Plant propagation: Sexual and Asexual propapagation.pptx
Plant propagation: Sexual and Asexual propapagation.pptxPlant propagation: Sexual and Asexual propapagation.pptx
Plant propagation: Sexual and Asexual propapagation.pptx
 
Beyond_Borders_Understanding_Anime_and_Manga_Fandom_A_Comprehensive_Audience_...
Beyond_Borders_Understanding_Anime_and_Manga_Fandom_A_Comprehensive_Audience_...Beyond_Borders_Understanding_Anime_and_Manga_Fandom_A_Comprehensive_Audience_...
Beyond_Borders_Understanding_Anime_and_Manga_Fandom_A_Comprehensive_Audience_...
 
Exploring_the_Narrative_Style_of_Amitav_Ghoshs_Gun_Island.pptx
Exploring_the_Narrative_Style_of_Amitav_Ghoshs_Gun_Island.pptxExploring_the_Narrative_Style_of_Amitav_Ghoshs_Gun_Island.pptx
Exploring_the_Narrative_Style_of_Amitav_Ghoshs_Gun_Island.pptx
 
Salient Features of India constitution especially power and functions
Salient Features of India constitution especially power and functionsSalient Features of India constitution especially power and functions
Salient Features of India constitution especially power and functions
 
Unit 3 Emotional Intelligence and Spiritual Intelligence.pdf
Unit 3 Emotional Intelligence and Spiritual Intelligence.pdfUnit 3 Emotional Intelligence and Spiritual Intelligence.pdf
Unit 3 Emotional Intelligence and Spiritual Intelligence.pdf
 
Sensory_Experience_and_Emotional_Resonance_in_Gabriel_Okaras_The_Piano_and_Th...
Sensory_Experience_and_Emotional_Resonance_in_Gabriel_Okaras_The_Piano_and_Th...Sensory_Experience_and_Emotional_Resonance_in_Gabriel_Okaras_The_Piano_and_Th...
Sensory_Experience_and_Emotional_Resonance_in_Gabriel_Okaras_The_Piano_and_Th...
 
Sociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning ExhibitSociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning Exhibit
 
How to Add New Custom Addons Path in Odoo 17
How to Add New Custom Addons Path in Odoo 17How to Add New Custom Addons Path in Odoo 17
How to Add New Custom Addons Path in Odoo 17
 
Fostering Friendships - Enhancing Social Bonds in the Classroom
Fostering Friendships - Enhancing Social Bonds  in the ClassroomFostering Friendships - Enhancing Social Bonds  in the Classroom
Fostering Friendships - Enhancing Social Bonds in the Classroom
 
UGC NET Paper 1 Mathematical Reasoning & Aptitude.pdf
UGC NET Paper 1 Mathematical Reasoning & Aptitude.pdfUGC NET Paper 1 Mathematical Reasoning & Aptitude.pdf
UGC NET Paper 1 Mathematical Reasoning & Aptitude.pdf
 
Python Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docxPython Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docx
 
How to Manage Global Discount in Odoo 17 POS
How to Manage Global Discount in Odoo 17 POSHow to Manage Global Discount in Odoo 17 POS
How to Manage Global Discount in Odoo 17 POS
 
COMMUNICATING NEGATIVE NEWS - APPROACHES .pptx
COMMUNICATING NEGATIVE NEWS - APPROACHES .pptxCOMMUNICATING NEGATIVE NEWS - APPROACHES .pptx
COMMUNICATING NEGATIVE NEWS - APPROACHES .pptx
 

Triangular Learner Model

  • 7. I. Triangular Learner Model • Knowledge, learning styles and learning history are prerequisites for modeling a learner • While learning history changes frequently, learning styles and knowledge are relatively stable. The combination of them ensures the integrity of information about the learner • User knowledge is domain-specific information and learning styles are personal traits. Combining them lets the user modeling system take full advantage of both domain-specific and domain-independent information Why TLM? 22/01/2022 TLM - Core of PhD research 7
  • 8. I. Triangular Learner Model The extended Triangular Learner Model 22/01/2022 TLM - Core of PhD research 8
  • 9. I. Triangular Learner Model • How to build up TLM? • How to manipulate (manage) TLM? • How to infer new information from TLM? → Zebra: the user modeling system for TLM 22/01/2022 TLM - Core of PhD research 9
  • 10. II. Zebra: a user modeling system for TLM • Mining Engine (ME) manages the learning history sub-model of TLM • Belief Network Engine (BNE) manages the knowledge sub-model and learning style sub-model of TLM • Communication Interfaces (CI) allow users and adaptive systems restricted access to view or modify TLM 22/01/2022 TLM - Core of PhD research 10
  • 11. II. Zebra: a user modeling system for TLM • Collecting learners’ data, monitoring their actions, structuring and updating TLM • Providing important information to the belief network engine • Supporting learning concept recommendation • Discovering other characteristics (beyond knowledge and learning styles) such as interests, goals, etc. • Supporting collaborative learning through constructing learner groups (communities) Mining Engine 22/01/2022 TLM - Core of PhD research 11
  • 12. II. Zebra: a user modeling system for TLM • Inferring new personal traits from TLM by using the deduction mechanism available in belief networks • This engine applies Bayesian networks and hidden Markov models in its inference mechanism • Two sub-models, knowledge and learning style, are managed by this engine Belief Network Engine 22/01/2022 TLM - Core of PhD research 12
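To make the division of labour concrete, here is a minimal, hypothetical Python skeleton of the three Zebra components described on slides 10-12; the class and method names are illustrative only and are not taken from the actual Zebra implementation.

```python
# Hypothetical skeleton of Zebra's three components (names are illustrative,
# not taken from the actual Zebra source code).

class MiningEngine:
    """Manages the learning history sub-model (collects and mines learner data)."""
    def record_action(self, learner_id, action):
        ...  # append the action to the learner's history

    def mine_patterns(self, learner_id):
        ...  # e.g. sequential pattern mining over the history

class BeliefNetworkEngine:
    """Manages the knowledge and learning style sub-models (BN + HMM inference)."""
    def infer_knowledge(self, learner_id, evidence):
        ...  # posterior probabilities over domain concepts

    def infer_learning_style(self, learner_id, observations):
        ...  # most likely style sequence (e.g. via Viterbi)

class CommunicationInterface:
    """Gives users and adaptive systems restricted read/write access to TLM."""
    def __init__(self, mining_engine, belief_engine):
        self.mining_engine = mining_engine
        self.belief_engine = belief_engine

    def get_model_view(self, learner_id):
        ...  # return only the parts of TLM the caller is allowed to see
```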
  • 13. II. Zebra: a user modeling system for TLM The extended architecture of Zebra when interacting with AES 22/01/2022 TLM - Core of PhD research 13
  • 14. III. Knowledge sub-model Knowledge sub-model = overlay model + Bayesian network (BN) 22/01/2022 TLM - Core of PhD research 14
  • 15. III. Knowledge sub-model Determining the CPT(s) is based on the weights of arcs: $\Pr(X = 1 \mid Y_1, Y_2, \ldots, Y_n) = \sum_{i=1}^{n} w_i h_i$, where $h_i = 1$ if $Y_i = 1$ and $h_i = 0$ otherwise. 22/01/2022 TLM - Core of PhD research 15
  • 16. III. Knowledge sub-model Determining the CPT(s) is based on the weights of arcs, using the formula $\Pr(X = 1 \mid Y_1, \ldots, Y_n) = \sum_{i=1}^{n} w_i h_i$ from slide 15:
T1 (node J with parents C, O, I and arc weights 0.1, 0.5, 0.4):
C O I | Pr(J = 1)                    | Pr(J = 0) = 1 - Pr(J = 1)
1 1 1 | 1.0 (0.1*1 + 0.5*1 + 0.4*1)  | 0.0
1 1 0 | 0.6 (0.1*1 + 0.5*1 + 0.4*0)  | 0.4
1 0 1 | 0.5 (0.1*1 + 0.5*0 + 0.4*1)  | 0.5
1 0 0 | 0.1 (0.1*1 + 0.5*0 + 0.4*0)  | 0.9
0 1 1 | 0.9 (0.1*0 + 0.5*1 + 0.4*1)  | 0.1
0 1 0 | 0.5 (0.1*0 + 0.5*1 + 0.4*0)  | 0.5
0 0 1 | 0.4 (0.1*0 + 0.5*0 + 0.4*1)  | 0.6
0 0 0 | 0.0 (0.1*0 + 0.5*0 + 0.4*0)  | 1.0
T2 (node E with arc weight 0.8):
E | Pr(E = 1)   | Pr(E = 0) = 1 - Pr(E = 1)
1 | 0.8 (0.8*1) | 0.2
0 | 0.0 (0.8*0) | 1.0
T3 (node Q with arc weight 0.2):
Q | Pr(Q = 1)   | Pr(Q = 0) = 1 - Pr(Q = 1)
1 | 0.2 (0.2*1) | 0.8
0 | 0.0 (0.2*0) | 1.0
22/01/2022 TLM - Core of PhD research 16
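The weighted-sum rule on slides 15-16 is easy to script. The following Python sketch (an illustration, not the author's code) rebuilds table T1 for node J with parents C, O, I and arc weights 0.1, 0.5, 0.4.

```python
from itertools import product

def cpt_from_weights(weights):
    """Build a CPT row for every parent combination using
    Pr(X = 1 | parents) = sum(w_i * h_i), where h_i = 1 if the i-th parent
    is 1 and 0 otherwise (the rule on slide 15)."""
    table = {}
    for parents in product([1, 0], repeat=len(weights)):
        p_true = sum(w * h for w, h in zip(weights, parents))
        table[parents] = (p_true, 1.0 - p_true)   # (Pr(X=1), Pr(X=0))
    return table

# Node J with parents C, O, I and arc weights 0.1, 0.5, 0.4 (table T1 above).
for parents, (p1, p0) in cpt_from_weights([0.1, 0.5, 0.4]).items():
    print(parents, round(p1, 2), round(p0, 2))
```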
  • 17. III. Knowledge sub-model • Parameter learning: using the Expectation Maximization (EM) algorithm or Maximum Likelihood Estimation (MLE); both are applied to beta distributions • Structure learning and monitoring: using a Dynamic Bayesian Network (DBN) Improving knowledge sub-model 22/01/2022 TLM - Core of PhD research 17
  • 18. III. Knowledge sub-model (EM) Beta density function 22/01/2022 TLM - Core of PhD research 18
  • 19. III. Knowledge sub-model (EM) EM technique 22/01/2022 TLM - Core of PhD research 19
  • 20. III. Knowledge sub-model (EM) EM technique 22/01/2022 TLM - Core of PhD research 20
  • 21. III. Knowledge sub-model (MLE) • The essence of maximizing the likelihood function is to find the peak of the curve of LnL(θ). • This can be done by setting the first-order partial derivative of LnL(θ) with respect to each parameter θi to 0 and solving the resulting equation for θi. For the beta distribution, $L(\theta) = \prod_{i=1}^{n} f(x_i; a, b) = \prod_{i=1}^{n} \frac{1}{B(a,b)} x_i^{a-1}(1-x_i)^{b-1} = \left(\frac{1}{B(a,b)}\right)^{n} \prod_{i=1}^{n} x_i^{a-1}(1-x_i)^{b-1}$ MLE technique 22/01/2022 TLM - Core of PhD research 21
  • 22. III. Knowledge sub-model (MLE) The equations whose solutions are the parameter estimators: setting ∂LnL(a, b)/∂a = 0 and ∂LnL(a, b)/∂b = 0 gives two coupled equations in the estimates a_k and b_k, which are solved by the iterative algorithm on the next slide. 22/01/2022 TLM - Core of PhD research 22
  • 23. III. Knowledge sub-model (MLE) Iterative Algorithm for MLE 22/01/2022 TLM - Core of PhD research 23
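Slides 21-23 estimate the beta parameters a and b by maximizing LnL(θ). The exact iterative equations of the original report are not reproduced here; the sketch below is a generic numerical MLE in Python that assumes SciPy is available and simply minimizes the negative log-likelihood from a method-of-moments starting point.

```python
import numpy as np
from scipy import optimize, stats

def fit_beta_mle(x):
    """Fit Beta(a, b) to data in (0, 1) by numerically maximizing LnL(a, b)."""
    x = np.asarray(x)
    # Method-of-moments starting point.
    m, v = x.mean(), x.var()
    common = m * (1 - m) / v - 1
    a0, b0 = max(m * common, 1e-3), max((1 - m) * common, 1e-3)

    def neg_log_likelihood(params):
        a, b = params
        return -np.sum(stats.beta.logpdf(x, a, b))

    result = optimize.minimize(neg_log_likelihood, [a0, b0],
                               bounds=[(1e-6, None), (1e-6, None)])
    return result.x  # estimated (a, b)

# Example: simulated "mastery" scores in (0, 1).
rng = np.random.default_rng(0)
print(fit_beta_mle(rng.beta(2.0, 5.0, size=500)))
```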
  • 24. III. Knowledge sub-model (DBN) • An initial BN G0 = {X[0], Pr(X[0])} at the first time point t = 0 • A transition BN is a template consisting of a transition DAG G→ containing the variables in X[t], X[t+1] and a transition probability distribution Pr→(X[t+1] | X[t]) A DBN is a BN containing variables that comprise T variable vectors X[t] 22/01/2022 TLM - Core of PhD research 24
  • 25. III. Knowledge sub-model (DBN) • A DBN can model the temporal relationships among variables, capturing the dynamic aspect • So a DBN allows monitoring the user’s process of gaining knowledge and evaluating her/his knowledge • The size of the DBN grows very large when the process continues for a long time • The number of transition dependencies among points in time becomes too large to compute posterior marginal probabilities Strong points of DBN Drawbacks of DBN 22/01/2022 TLM - Core of PhD research 25
  • 26. III. Knowledge sub-model (DBN) • To overcome these drawbacks, the proposed algorithm keeps both the size of the DBN and the number of Conditional Probability Tables (CPTs) intact when the process continues for a long time • To solve the problem of temporary slips and lucky guesses: “the learner does (doesn’t) know a particular subject but there is solid evidence convincing that she/he doesn’t (does) understand it; this evidence just reflects a temporary slip (or lucky guess)”. Purposes of the suggested algorithm to improve DBN 22/01/2022 TLM - Core of PhD research 26
  • 27. III. Knowledge sub-model (DBN) 1. Initializing DBN 2. Specifying transition weights 3. Re-constructing DBN 4. Normalizing weights of dependencies 5. Re-defining CPT(s) 6. Probabilistic inference The algorithm for DBN includes 6 steps that are repeated whenever evidence occurs 22/01/2022 TLM - Core of PhD research 27
  • 28. III. Knowledge sub-model (DBN) 1. Initializing DBN 2. Specifying transition weights 3. Re-constructing 4. Normalizing weights 5. Re-defining CPT(s) 6. Probabilistic inference 22/01/2022 TLM - Core of PhD research 28
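The six-step algorithm itself is specific to the original report and is not reproduced here. As a rough illustration of the underlying idea of keeping the model size constant while evidence keeps arriving, the sketch below performs a generic two-slice filtering update for a single binary knowledge variable; all probabilities are hypothetical, and the emission matrix crudely models slips and lucky guesses.

```python
import numpy as np

# A generic two-slice update for one binary knowledge variable K.
# This is NOT the six-step algorithm above; it only illustrates how folding each
# new time slice into the current belief keeps the model size constant over time.

transition = np.array([[0.90, 0.10],    # Pr(K[t+1] | K[t]=0): may become known
                       [0.05, 0.95]])   # Pr(K[t+1] | K[t]=1): small chance of forgetting
emission = np.array([[0.8, 0.2],        # Pr(wrong/right answer | K=0): lucky guesses possible
                     [0.2, 0.8]])       # Pr(wrong/right answer | K=1): temporary slips possible

def update_belief(belief, evidence):
    """One filtering step: predict with the transition model, then weight by the evidence."""
    predicted = belief @ transition                  # Pr(K[t+1] | evidence so far)
    weighted = predicted * emission[:, evidence]     # multiply in Pr(evidence | K[t+1])
    return weighted / weighted.sum()                 # normalize

belief = np.array([0.5, 0.5])                        # initial Pr(K[0])
for answer in [1, 1, 0, 1]:                          # 1 = correct, 0 = incorrect
    belief = update_belief(belief, answer)
    print(belief)
```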
  • 29. IV. Learning style sub-model • S = {s1, s2,…, sn} is the finite set of states • Ө = {θ1, θ2,…, θm} is the set of observations • A is the transition probability matrix in which aij is the probability that the process changes from the current state si to the next state sj • B is the observation probability matrix, where bi(k) is the probability of observation θk when the second stochastic process is in state si • ∏ is the initial state distribution, where πi is the probability that the stochastic process begins in state si Hidden Markov Model (HMM) is the 5-tuple Δ = 〈S, Ө, A, B, ∏〉 22/01/2022 TLM - Core of PhD research 29
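A direct way to hold the 5-tuple Δ = 〈S, Ө, A, B, ∏〉 in code is a small record type; the two-state example below uses made-up numbers purely for illustration.

```python
from typing import NamedTuple, List

class HMM(NamedTuple):
    """Direct translation of the 5-tuple Delta = <S, Theta, A, B, Pi>."""
    S: List[str]            # hidden states
    Theta: List[str]        # possible observations
    A: List[List[float]]    # A[i][j] = Pr(next state s_j | current state s_i)
    B: List[List[float]]    # B[i][k] = Pr(observation theta_k | state s_i)
    Pi: List[float]         # Pi[i]   = Pr(process starts in state s_i)

# Hypothetical two-state example (all numbers are invented for illustration).
delta = HMM(
    S=["verbal", "visual"],
    Theta=["reads text", "watches video"],
    A=[[0.8, 0.2],
       [0.3, 0.7]],
    B=[[0.9, 0.1],
       [0.2, 0.8]],
    Pi=[0.5, 0.5],
)
```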
  • 30. IV. Learning style sub-model Weather forecast example 22/01/2022 TLM - Core of PhD research 30
  • 31. IV. Learning style sub-model • Given an HMM and a sequence of observations O = {o1 → o2 → … → ok}, how to find the sequence of states U = {u1 → u2 → … → uk} such that U is most likely to have produced the observation sequence O • This is the uncovering problem: which sequence of state transitions is most likely to have led to this sequence of observations → Viterbi algorithm Uncovering problem 22/01/2022 TLM - Core of PhD research 31
  • 32. • Each learning style is now considered as a state • Users’ learning actions are considered as observations • After monitoring users’ learning process, we collect observations about them and then discover their styles by using the inference mechanism in HMM, namely the Viterbi algorithm Basic idea IV. Learning style sub-model 22/01/2022 TLM - Core of PhD research 32
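A minimal Viterbi implementation for the uncovering problem might look like the following sketch (standard dynamic programming; the commented usage reuses the hypothetical delta from the previous sketch).

```python
def viterbi(pi, A, B, observations):
    """Return the most likely state sequence for a list of observation indices.
    pi[i] = Pr(start in state i), A[i][j] = transition prob, B[i][k] = observation prob."""
    n_states = len(pi)
    # delta[i] = best probability of any state path ending in state i; psi stores back-pointers.
    delta = [pi[i] * B[i][observations[0]] for i in range(n_states)]
    psi = []
    for obs in observations[1:]:
        prev = delta
        delta, back = [], []
        for j in range(n_states):
            best_i = max(range(n_states), key=lambda i: prev[i] * A[i][j])
            delta.append(prev[best_i] * A[best_i][j] * B[j][obs])
            back.append(best_i)
        psi.append(back)
    # Backtrack from the best final state.
    state = max(range(n_states), key=lambda i: delta[i])
    path = [state]
    for back in reversed(psi):
        state = back[state]
        path.append(state)
    return list(reversed(path))

# Usage with the hypothetical delta defined in the earlier sketch:
# viterbi(delta.Pi, delta.A, delta.B, [0, 0, 1])  -> e.g. [0, 0, 1] meaning verbal, verbal, visual
```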
  • 33. • Suppose we choose the Honey-Mumford model and the Felder-Silverman model as principal models, which are represented by HMMs • We have three dimensions: Verbal/Visual, Activist/Reflector, Theorist/Pragmatist, which are modeled as three HMMs ∆1, ∆2, ∆3 respectively: ∆1 = 〈S1, Ө1, A1, B1, ∏1〉, ∆2 = 〈S2, Ө2, A2, B2, ∏2〉, ∆3 = 〈S3, Ө3, A3, B3, ∏3〉 Basic idea IV. Learning style sub-model 22/01/2022 TLM - Core of PhD research 33
  • 34. 1. Defining states (S1, S2, S3) 2. Defining initial state distributions (∏1, ∏2, ∏3) 3. Defining transition probability matrices (A1, A2, A3) 4. Defining observations (Ө1, Ө2, Ө3) 5. Defining observation probability matrices (B1, B2, B3) Technique includes 5 steps IV. Learning style sub-model 22/01/2022 TLM - Core of PhD research 34
  • 35. IV. Learning style sub-model 22/01/2022 TLM - Core of PhD research 35
  • 36. An example of inferring a student’s learning styles: from the learning objects the student selected (the sequence of student observations), the inferred sequence of state transitions shows that this student is a verbal, reflective and theoretical person. IV. Learning style sub-model 22/01/2022 TLM - Core of PhD research 36
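The same decoding can be run once per dimension. The sketch below uses brute-force path enumeration (equivalent to Viterbi for such tiny models) over three toy dimension models; every probability and observation encoding is invented for illustration.

```python
from itertools import product

def best_path(pi, A, B, observations, states):
    """Brute-force decoding: enumerate every state path (fine for 2 states and short
    observation sequences; the Viterbi sketch above does the same job efficiently)."""
    def prob(path):
        p = pi[path[0]] * B[path[0]][observations[0]]
        for prev, cur, obs in zip(path, path[1:], observations[1:]):
            p *= A[prev][cur] * B[cur][obs]
        return p
    best = max(product(range(len(states)), repeat=len(observations)), key=prob)
    return [states[i] for i in best]

# Three toy dimension models sharing the same made-up matrices.
A = [[0.8, 0.2], [0.2, 0.8]]
B = [[0.7, 0.3], [0.3, 0.7]]
pi = [0.5, 0.5]
observed = [0, 0, 1, 0]   # encoded learning actions for one student

for states in (["verbal", "visual"], ["activist", "reflector"], ["theorist", "pragmatist"]):
    print(states[0] + "/" + states[1], "->", best_path(pi, A, B, observed, states))
```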
  • 37. V. Learning history sub-model 1. Providing necessary information for the two remaining sub-models: the learning style sub-model and the knowledge sub-model 2. Supporting learning concept recommendation 3. Mining learners’ educational data in order to discover learners’ other characteristics such as interests, background, goals… 4. Supporting collaborative learning through constructing learner groups. Learning history managed by the Mining Engine has four responsibilities 22/01/2022 TLM - Core of PhD research 37
  • 38. • Rule-based filtering: manually or automatically generated decision rules that are used to recommend items to users • Content-based filtering: recommends items that are considered appropriate to the user information in his profile • Collaborative filtering: also considered social filtering; it matches the ratings of the current user for items with those of similar users in order to produce recommendations for new items Recommendation methods V. Learning history sub-model (recommendation) 22/01/2022 TLM - Core of PhD research 38
  • 39. • Sequential pattern mining belongs to the collaborative filtering family • The user does not rate items explicitly; instead, his series of chosen items is recorded as sequences to construct the sequence database, which is mined to find frequently repeated patterns he may choose in the future • In the learning context, items can be domain concepts / learning objects which students access or learn Sequential pattern V. Learning history sub-model (recommendation) 22/01/2022 TLM - Core of PhD research 39
  • 40. • Suppose the concepts in a Java course are: data type, package, class & OOP, selection structure, virtual machine, loop structure, control structure, and interface, which are in turn denoted as d, p, o, s, v, l, c, f • At our e-learning website, students access learning material related to these concepts in sessions; each session contains only one itemset and sessions are ordered by time. The student’s learning sequence is constituted of the itemsets accessed in all his sessions Given problem V. Learning history sub-model (recommendation) 22/01/2022 TLM - Core of PhD research 40
  • 41. 1. Applying techniques of mining user learning data to find learning sequential patterns (not discussed here) 2. Breaking such patterns into concepts which are recommended to users Students accessed learning material in their past sessions; how does the system recommend appropriate concepts to a student for the next visits? → mining sequential patterns → solution: V. Learning history sub-model (recommendation) 22/01/2022 TLM - Core of PhD research 41
  • 42. • Suppose the sequential pattern 〈osc(sc)〉 is discovered, which means: “class & OOP” → “selection structure” → ”control structure” → ”selection structure, control structure” • The pattern is considered as the learning "route" that students preferred or followed often in the past • The next time a student chooses one concept, which concepts should the adaptive learning system recommend next? → the patterns should be broken into association rules with their confidence Problem V. Learning history sub-model (recommendation) 22/01/2022 TLM - Core of PhD research 42
  • 43. 1. Breaking the entire 〈osc(sc)〉 into litemsets such as o, s, c, (sc) and determining all possible large 2-sequences whose order must comply with the order of the sequential pattern. There are six large 2-sequences: 〈os〉, 〈oc〉, 〈o(sc)〉, 〈sc〉, 〈s(sc)〉, 〈c(sc)〉. 2. Thus, we have six rules derived from these large 2-sequences in the form “left-hand litemset → right-hand litemset”, for example, rule “s→c” derived from the 2-sequence 〈sc〉 3. Computing the confidences of the rules and sorting them: confidence(x → y) = support(〈xy〉) / support(〈x〉). Rules whose confidence is less than the threshold min_conf are removed Breaking technique V. Learning history sub-model (recommendation) 22/01/2022 TLM - Core of PhD research 43
  • 44. • If the student chooses the concept (itemset) x, the system will find all rules broken from all sequential patterns whose left-hand litemset contains x • Then, these rules are sorted by their confidences in descending order • The final outcome is an ordered list of right-hand litemsets (concepts), which are recommended to students Recommended list if the user chooses the concept “class & oop” Recommended List V. Learning history sub-model (recommendation) 22/01/2022 TLM - Core of PhD research 44
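The breaking technique on slides 42-44 can be sketched as follows; the toy sequence database, the support counting and the min_conf value are assumptions, while the pattern 〈osc(sc)〉 and its six 2-sequences match the slides.

```python
from itertools import combinations

def contains(sequence, pattern):
    """True if `pattern` (list of itemsets) is a subsequence of `sequence` (list of itemsets)."""
    pos = 0
    for itemset in sequence:
        if pos < len(pattern) and set(pattern[pos]) <= set(itemset):
            pos += 1
    return pos == len(pattern)

def support(database, pattern):
    return sum(contains(seq, pattern) for seq in database) / len(database)

# Toy learning-sequence database over the Java concepts d, p, o, s, v, l, c, f (illustrative data).
database = [
    [{"o"}, {"s"}, {"c"}, {"s", "c"}],
    [{"o"}, {"s"}, {"s", "c"}],
    [{"d"}, {"o"}, {"c"}],
    [{"o"}, {"l"}, {"s"}, {"c"}],
]

# Steps 1-2: break the pattern <o s c (sc)> into the six ordered 2-sequences / rules.
pattern = [{"o"}, {"s"}, {"c"}, {"s", "c"}]
rules = []
for left, right in combinations(pattern, 2):
    conf = support(database, [left, right]) / support(database, [left])
    rules.append((left, right, conf))

# Step 3: keep rules above min_conf and sort them by confidence.
min_conf = 0.5
rules = sorted([r for r in rules if r[2] >= min_conf], key=lambda r: r[2], reverse=True)

# Recommendation: if the student chooses concept "o", list right-hand itemsets of matching rules.
print([(sorted(r[1]), round(r[2], 2)) for r in rules if "o" in r[0]])
```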
  • 45. • The series of user accesses in his/her history is modeled as documents, so the user is indirectly referred to as a “document”. • User interests are the classes such documents belong to There are two new points of view V. Learning history sub-model (user interest) 22/01/2022 TLM - Core of PhD research 45
  • 46. 1. Documents in the training corpus are represented according to the vector model. Each element of the vector is the product of term frequency and inverse document frequency; however, the inverse document frequency can be removed from each element for convenience 2. Classifying the training corpus by applying a decision tree, support vector machine or neural network. 3. Mining the user’s access history to find maximum frequent itemsets. Each itemset is considered an interesting document and its member items are considered as terms. Such interesting documents are modeled as vectors 4. Applying the classifiers (from step 2) to the interesting documents (from step 3) in order to choose which classes are most suitable for them. Such classes are the user interests Our approach includes the four following steps V. Learning history sub-model (user interest) 22/01/2022 TLM - Core of PhD research 46
  • 47. • Suppose that in some library or website, user U searches for books and documents of interest • There is a demand for discovering his interests so that the library or website can provide adapted documents to him whenever he visits next • Given a set of keywords or terms {computer, programming language, algorithm, derivative} that user U often looks for, his search history is shown in the following table: User search history V. Learning history sub-model (user interest) 22/01/2022 TLM - Core of PhD research 47
  • 48. Using SVM, ANN or a Decision Tree to classify this vector This vector belongs to the class computer science → the user interest is computer science V. Learning history sub-model (user interest) 22/01/2022 TLM - Core of PhD research 48
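A compact way to mimic steps 1-4 of slides 46-48 is sketched below. The hand-made class profiles and cosine matching stand in for the trained SVM / neural network / decision tree classifier, and the term list and search history are hypothetical.

```python
import math
from collections import Counter

TERMS = ["computer", "programming language", "algorithm", "derivative"]

def tf_vector(sessions):
    """Term-frequency vector over the fixed term list (IDF dropped, as allowed on slide 46)."""
    counts = Counter(term for session in sessions for term in session)
    return [counts[t] for t in TERMS]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# Hand-made class profiles standing in for a trained classifier (illustrative only).
class_profiles = {
    "computer science": [1, 1, 1, 0],
    "mathematics":      [0, 0, 1, 1],
}

# User U's search sessions (hypothetical history resembling the table on slide 47).
history = [["computer", "programming language"],
           ["algorithm", "computer"],
           ["programming language", "algorithm"]]

vector = tf_vector(history)
interest = max(class_profiles, key=lambda c: cosine(vector, class_profiles[c]))
print(vector, "->", interest)   # expected: computer science
```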
  • 49. V. Learning history sub-model (clustering) • Individual adaptation concerns each single user • Community (or group) adaptation focuses on a community (or group) of users There are two kinds of adaptation 22/01/2022 TLM - Core of PhD research 49
  • 50. V. Learning history sub-model (clustering) • Common features in a group are relatively stable, so it is easy for adaptive systems to perform adaptive tasks accurately • If a new user logs into the system, she/he will be classified into a group and the initial information of her/his model is assigned from the common features of that group • It is very useful if collaborative learning is restricted to a group of similar users The problem that needs to be solved now is to cluster user models, because a group is a cluster of similar user models. 22/01/2022 TLM - Core of PhD research 50
  • 51. V. Learning history sub-model (clustering) Clustering in case that the user model is represented as a vector Ui = {ui1, ui2,…, uij,…, uin}. The dissimilarity of two user models is defined as the Euclidean distance between them: $dissim(U_1, U_2) = distance(U_1, U_2) = \sqrt{(u_{11}-u_{21})^2 + (u_{12}-u_{22})^2 + \cdots + (u_{1n}-u_{2n})^2}$ K-means algorithm 22/01/2022 TLM - Core of PhD research 51
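The vector dissimilarity and one k-means assignment step can be sketched as follows (full k-means would alternate this assignment with recomputing centroids); the user vectors and initial centroids are made up.

```python
import math

def dissim(u1, u2):
    """Euclidean dissimilarity between two user-model vectors (slide 51)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u1, u2)))

# Toy user-model vectors (e.g. mastery levels over three concepts) and two initial centroids.
users = {"u1": [0.9, 0.8, 0.7], "u2": [0.2, 0.1, 0.3], "u3": [0.85, 0.9, 0.6]}
centroids = [[0.8, 0.8, 0.8], [0.2, 0.2, 0.2]]

# One k-means assignment step: each user joins the group with the nearest centroid.
groups = {name: min(range(len(centroids)), key=lambda k: dissim(vec, centroids[k]))
          for name, vec in users.items()}
print(groups)   # expected: u1 and u3 in group 0, u2 in group 1
```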
  • 52. V. Learning history sub-model (clustering) Clustering in case that the user model is an overlay model (graph) The dissimilarity of two graph models: $dissim(G_1, G_2) = distance(G_1, G_2) = \sum_{j=1}^{n} \frac{|v_{1j} - v_{2j}|}{depth(v_{1j})}$ 22/01/2022 TLM - Core of PhD research 52
  • 53. V. Learning history sub-model (clustering) Clustering in case that the user model is a weighted graph The dissimilarity of two graphs: $dissim(G_1, G_2) = distance(G_1, G_2) = \sum_{j=1}^{n} weight(v_{1j}) \cdot \frac{|v_{1j} - v_{2j}|}{depth(v_{1j})}$ 22/01/2022 TLM - Core of PhD research 53
  • 54. V. Learning history sub-model (clustering) Clustering in case that the user model is a Bayesian network The dissimilarity of two graph models: $dissim(G_1, G_2) = distance(G_1, G_2) = \sum_{j=1}^{n} \frac{|\Pr(v_{1j}) - \Pr(v_{2j})|}{depth(v_{1j})}$ 22/01/2022 TLM - Core of PhD research 54
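Under the reading of slides 52-54 given above (absolute per-node differences scaled down by node depth, optionally scaled up by node weight, with Pr(v) as the per-node value for Bayesian-network models), one function covers all three cases; the numbers are illustrative.

```python
def graph_dissim(values1, values2, depth, weight=None):
    """Dissimilarity of two overlay-style models defined on the same n domain nodes.
    values1/values2: per-node values (mastery scores, or Pr(v) for Bayesian-network models);
    depth: per-node depth in the domain graph; weight: optional per-node weights."""
    total = 0.0
    for j in range(len(values1)):
        term = abs(values1[j] - values2[j]) / depth[j]
        if weight is not None:
            term *= weight[j]
        total += term
    return total

# Two learners' overlay models on four concepts (toy numbers), with concept depths and weights.
g1, g2 = [0.9, 0.6, 0.4, 0.2], [0.8, 0.3, 0.5, 0.1]
depth = [1, 2, 2, 3]
print(graph_dissim(g1, g2, depth))                       # plain overlay graph (slide 52)
print(graph_dissim(g1, g2, depth, weight=[1, 2, 1, 1]))  # weighted graph (slide 53)
```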
  • 55. V. Learning history sub-model (clustering) • Cosine similarity measure: $sim(U_i, U_j) = \cos(U_i, U_j) = \frac{U_i \cdot U_j}{|U_i| |U_j|} = \frac{\sum_{k=1}^{n} u_{ik} u_{jk}}{\sqrt{\sum_{k=1}^{n} u_{ik}^2} \sqrt{\sum_{k=1}^{n} u_{jk}^2}}$ • Correlation coefficient: $sim(U_i, U_j) = correl(U_i, U_j) = \frac{\sum_{k=1}^{n} (u_{ik} - \bar{U}_i)(u_{jk} - \bar{U}_j)}{\sqrt{\sum_{k=1}^{n} (u_{ik} - \bar{U}_i)^2} \sqrt{\sum_{k=1}^{n} (u_{jk} - \bar{U}_j)^2}}$ K-medoids algorithm and two similarity measures 22/01/2022 TLM - Core of PhD research 55
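Both similarity measures are straightforward to code; either could serve as the pairwise similarity inside k-medoids. A small sketch with made-up vectors:

```python
import math

def cosine_sim(ui, uj):
    """Cosine similarity between two user-model vectors (slide 55)."""
    dot = sum(a * b for a, b in zip(ui, uj))
    return dot / (math.sqrt(sum(a * a for a in ui)) * math.sqrt(sum(b * b for b in uj)))

def correlation_sim(ui, uj):
    """Pearson correlation coefficient between two user-model vectors (slide 55)."""
    mi, mj = sum(ui) / len(ui), sum(uj) / len(uj)
    num = sum((a - mi) * (b - mj) for a, b in zip(ui, uj))
    den = (math.sqrt(sum((a - mi) ** 2 for a in ui)) *
           math.sqrt(sum((b - mj) ** 2 for b in uj)))
    return num / den

u1, u2 = [0.9, 0.8, 0.7, 0.4], [0.8, 0.7, 0.9, 0.5]
print(round(cosine_sim(u1, u2), 3), round(correlation_sim(u1, u2), 3))
```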
  • 56. TLM - Core of PhD research 56 THANK YOU FOR YOUR CONSIDERATION 22/01/2022