The iris dataset: 150 samples, each with four measurements in cm (sepal length, sepal width, petal length, petal width) plus a class label; 50 samples each of Iris-setosa, Iris-versicolor and Iris-virginica. The first few rows:

5.1 3.5 1.4 0.2 Iris-setosa
4.9 3.0 1.4 0.2 Iris-setosa
4.7 3.2 1.3 0.2 Iris-setosa
4.6 3.1 1.5 0.2 Iris-setosa
5.0 3.6 1.4 0.2 Iris-setosa
...
import numpy as np

class Perceptron(object):
    """
    eta: learning rate
    n_iter: number of training passes over the weight vector
    w_: weight vector of the neuron
    errors_: records the number of misclassifications per pass
    """
    def __init__(self, eta, n_iter):
        self.eta = eta
        self.n_iter = n_iter

    def net_input(self, X):
        """
        z = W0*1 + W1*X1 + ... + Wn*Xn
        """
        return np.dot(X, self.w_[1:]) + self.w_[0]

    def predict(self, X):
        return np.where(self.net_input(X) >= 0.0, 1, -1)

    def fit(self, X, y):
        """
        Train the neuron on the input data; X holds the sample vectors,
        y the correct class of each sample.
        X: shape [n_samples, n_features]
        e.g. X: [[1, 2, 3], [4, 5, 6]]
             n_samples: 2
             n_features: 3
             y: [1, -1]
        """
        # Initialize the weight vector to zeros. The extra 1 is for w0,
        # the threshold of the step function mentioned earlier.
        self.w_ = np.zeros(1 + X.shape[1])
        self.errors_ = []

        for _ in range(self.n_iter):
            errors = 0
            # X: [[1,2,3],[4,5,6]], y: [1, -1]
            # zip(X, y) = [([1,2,3], 1), ([4,5,6], -1)]
            for xi, target in zip(X, y):
                # update = learning rate * (y - y')
                update = self.eta * (target - self.predict(xi))
                # xi is a vector, so update * xi is equivalent to:
                # [w(1) += X[1]*update, w(2) += X[2]*update, w(3) += X[3]*update]
                self.w_[1:] += update * xi
                self.w_[0] += update
                errors += int(update != 0.0)
            self.errors_.append(errors)


file = "D:/PyCharm_test_file/Jupyter_test/iris1.xlsx"
import pandas as pd
df = pd.read_excel(file, header=None)
#df.head(10)
#print(df)

import matplotlib.pyplot as plt
import numpy as np
y = df.loc[0:99, 4].values
y = np.where(y == 'Iris-setosa', -1, 1)
#print(y)
X = df.iloc[0:100, [0, 2]].values
#print(X)
plt.scatter(X[:50, 0], X[:50, 1], color='red', marker='o', label='setosa')
plt.scatter(X[50:100, 0], X[50:100, 1], color='blue', marker='x', label='versicolor')
plt.rcParams['font.sans-serif'] = ['SimHei']
plt.xlabel('petal length')
plt.ylabel('sepal length')
plt.legend(loc='upper left')
plt.show()

from matplotlib.colors import ListedColormap

def plot_decision_regions(X, y, classifier, resolution=0.02):
    markers = ('s', 'x', 'o', 'v')
    colors = ('red', 'blue', 'lightgreen', 'gray', 'cyan')
    cmap = ListedColormap(colors[:len(np.unique(y))])
    x1_min, x1_max = X[:, 0].min() - 1, X[:, 0].max() + 1
    x2_min, x2_max = X[:, 1].min() - 1, X[:, 1].max() + 1
    # Expand the x1 and x2 ranges produced by arange into two 2-D coordinate matrices
    xx1, xx2 = np.meshgrid(np.arange(x1_min, x1_max, resolution),
                           np.arange(x2_min, x2_max, resolution))
    # Predict: ravel flattens each matrix back into a 1-D vector
    Z = classifier.predict(np.array([xx1.ravel(), xx2.ravel()]).T)
    # Plot: reshape Z into a 2-D array with the same shape as xx1
    Z = Z.reshape(xx1.shape)
    # Draw the boundary between the two classes -> the data must be linearly separable
    plt.contourf(xx1, xx2, Z, alpha=0.4, cmap=cmap)
    plt.xlim(xx1.min(), xx1.max())
    plt.ylim(xx2.min(), xx2.max())
    for idx, cl in enumerate(np.unique(y)):
        plt.scatter(x=X[y == cl, 0], y=X[y == cl, 1], alpha=0.8,
                    c=cmap(idx), marker=markers[idx], label=cl)

ppn = Perceptron(0.1, 10)
ppn.fit(X, y)
plot_decision_regions(X, y, ppn, resolution=0.02)
Classification algorithm of the adaptive linear neuron (Adaline)
class AdalineGD(object):
    def __init__(self, eta=0.1, n_iter=50):
        self.eta = eta
        self.n_iter = n_iter

    def fit(self, X, y):  # the biggest difference from the perceptron
        self.w_ = np.zeros(1 + X.shape[1])
        self.cost_ = []  # cost function: measures the improvement; it should keep decreasing

        for i in range(self.n_iter):
            output = self.net_input(X)
            errors = y - output
            self.w_[1:] += self.eta * X.T.dot(errors)
            self.w_[0] += self.eta * errors.sum()
            cost = (errors ** 2).sum() / 2.0
            self.cost_.append(cost)
        return self

    def net_input(self, X):
        return np.dot(X, self.w_[1:]) + self.w_[0]

    def activation(self, X):
        return self.net_input(X)

    def predict(self, X):
        return np.where(self.activation(X) >= 0, 1, -1)

# Run the algorithm.
# The lower the learning rate, the more precise each weight update;
# the more iterations, the more accurate the optimized result.
ada = AdalineGD(eta=0.0001, n_iter=50)
ada.fit(X, y)
plot_decision_regions(X, y, classifier=ada)  # feed the data into the trained network and plot the predictions
plt.title('Adaline-Gradient descent')
plt.xlabel('sepal length')
plt.ylabel('petal length')
plt.legend(loc='upper left')
plt.show()

plt.plot(range(1, len(ada.cost_) + 1), ada.cost_, marker='o')  # check the improvement
plt.xlabel('Epochs')  # number of self-training passes
plt.ylabel('sum-squared-error')  # value of the cost function
plt.show()
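As a quick, self-contained check of the batch gradient-descent update used above (the toy data here is invented for illustration), the cost should shrink from one epoch to the next when the learning rate is small enough:

```python
import numpy as np

# Toy linearly separable data, 4 samples with 2 features (illustrative only)
X_toy = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y_toy = np.array([-1, -1, 1, 1])

eta, n_iter = 0.05, 20
w = np.zeros(1 + X_toy.shape[1])
cost_ = []
for _ in range(n_iter):
    output = X_toy.dot(w[1:]) + w[0]   # net input = activation for Adaline
    errors = y_toy - output            # continuous error, not just +1/-1
    w[1:] += eta * X_toy.T.dot(errors) # one batch update over all samples
    w[0] += eta * errors.sum()
    cost_.append((errors ** 2).sum() / 2.0)

print(cost_[0], cost_[-1])  # the cost decreases as the weights improve
```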
The perceptron's activation function is the unit step function: when the net input exceeds the threshold the output is 1, otherwise it is -1.
The adaptive linear neuron's activation function instead passes the weighted sum of the inputs straight through: w0 + x1*w1 + x2*w2 + ... . It compares this continuous output with the true result and, whenever they disagree, dynamically adjusts the weights accordingly -> gradient descent.
Gradient descent:
The sum-of-squared-errors cost function is U-shaped. When the slope of the tangent obtained by taking the partial derivative with respect to w is negative (positive), increase (decrease) the corresponding weight w of the neuron.
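In symbols, the rule described above is (standard gradient-descent notation, with z = w0 + w1*x1 + ... as elsewhere in these notes):

```latex
% Sum-of-squared-errors cost for Adaline, where \phi(z) = z:
J(w) = \tfrac{1}{2} \sum_i \bigl(y^{(i)} - \phi(z^{(i)})\bigr)^2

% Partial derivative with respect to weight w_j:
\frac{\partial J}{\partial w_j} = -\sum_i \bigl(y^{(i)} - \phi(z^{(i)})\bigr)\, x_j^{(i)}

% Weight update: a step against the slope
\Delta w_j = -\eta \,\frac{\partial J}{\partial w_j}
           = \eta \sum_i \bigl(y^{(i)} - \phi(z^{(i)})\bigr)\, x_j^{(i)}
```

So a negative slope gives a positive Δw_j (the weight is increased), and a positive slope gives a negative Δw_j, exactly as stated above.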
Data parsing and visualization
import pandas as pd
import matplotlib.pyplot as plt
import numpy as np

df = pd.read_csv(file, header=None)  # `file` is the path to the iris data file
y = df.loc[0:99, 4].values
y = np.where(y == 'Iris-setosa', -1, 1)
X = df.iloc[0:100, [0, 2]].values
plt.scatter(X[:50, 0], X[:50, 1], color='red', marker='o', label='setosa')
plt.scatter(X[50:100, 0], X[50:100, 1], color='blue', marker='x', label='versicolor')
plt.xlabel('petal length')
plt.ylabel('sepal length')
plt.legend(loc='upper left')
plt.show()
Classifying the data (feed the predicted data into the neural network and draw the result graphically)
from matplotlib.colors import ListedColormap

def plot_decision_regions(X, y, classifier, resolution=0.02):
    markers = ('s', 'x', 'o', 'v')
    colors = ('red', 'blue', 'lightgreen', 'gray', 'cyan')
    cmap = ListedColormap(colors[:len(np.unique(y))])
    x1_min, x1_max = X[:, 0].min() - 1, X[:, 0].max() + 1
    x2_min, x2_max = X[:, 1].min() - 1, X[:, 1].max() + 1
    # Expand the x1 and x2 ranges produced by arange into two 2-D coordinate matrices
    xx1, xx2 = np.meshgrid(np.arange(x1_min, x1_max, resolution),
                           np.arange(x2_min, x2_max, resolution))
    # Predict: ravel flattens each matrix back into a 1-D vector
    Z = classifier.predict(np.array([xx1.ravel(), xx2.ravel()]).T)
    # Plot: reshape Z into a 2-D array with the same shape as xx1
    Z = Z.reshape(xx1.shape)
    # Draw the boundary between the two classes -> the data must be linearly separable
    plt.contourf(xx1, xx2, Z, alpha=0.4, cmap=cmap)
    plt.xlim(xx1.min(), xx1.max())
    plt.ylim(xx2.min(), xx2.max())
    for idx, cl in enumerate(np.unique(y)):
        plt.scatter(x=X[y == cl, 0], y=X[y == cl, 1], alpha=0.8,
                    c=cmap(idx), marker=markers[idx], label=cl)
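To see what meshgrid and ravel are doing in the plotting function, here is a tiny standalone example (the array values are invented for illustration):

```python
import numpy as np

x1 = np.arange(0.0, 3.0, 1.0)   # [0., 1., 2.]
x2 = np.arange(0.0, 2.0, 1.0)   # [0., 1.]
xx1, xx2 = np.meshgrid(x1, x2)  # both become 2x3 matrices covering the grid

print(xx1.shape)                # (2, 3)
grid = np.array([xx1.ravel(), xx2.ravel()]).T  # one (x1, x2) row per grid point
print(grid.shape)               # (6, 2)

# A classifier's flat predictions over the grid can be reshaped back
# into the grid's 2-D shape for contourf:
Z = np.zeros(len(grid)).reshape(xx1.shape)
print(Z.shape)                  # (2, 3)
```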
The perceptron classification algorithm
import numpy as np

class Perceptron(object):
    def __init__(self, eta=0.01, n_iter=10):  # eta: learning rate; n_iter: number of passes used to train the weight vector
        self.eta = eta
        self.n_iter = n_iter

    # Train the neuron on the input samples. X: shape [n_samples, n_features];
    # n_features is the number of input signals X the neuron receives.
    def fit(self, X, y):
        self.w_ = np.zeros(1 + X.shape[1])  # initialize the weight vector w_ to 0; the extra 1 holds the threshold w0 of the activation function
        self.errors_ = []  # errors_ records the number of misclassifications per pass; it is a list

        for _ in range(self.n_iter):
            errors = 0
            for xi, target in zip(X, y):
                # weight update
                update = self.eta * (target - self.predict(xi))
                self.w_[1:] += update * xi
                # threshold update
                self.w_[0] += update
                errors += int(update != 0.0)
            self.errors_.append(errors)

    def net_input(self, X):  # dot product of the input signals X and the weights w
        return np.dot(X, self.w_[1:]) + self.w_[0]

    def predict(self, X):  # predict the class
        return np.where(self.net_input(X) >= 0.0, 1, -1)
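A minimal, self-contained run of the same perceptron learning rule on a toy 2-D dataset (the data is invented for illustration); on linearly separable data the per-pass misclassification count drops to zero after a few passes:

```python
import numpy as np

# Toy linearly separable data: samples with a small first feature are
# class -1, samples with a large first feature are class +1
X_demo = np.array([[1.0, 1.0], [1.5, 0.5], [3.5, 1.0], [4.0, 0.5]])
y_demo = np.array([-1, -1, 1, 1])

eta, n_iter = 0.1, 10
w = np.zeros(1 + X_demo.shape[1])
errors_ = []
for _ in range(n_iter):
    errors = 0
    for xi, target in zip(X_demo, y_demo):
        predicted = 1 if xi.dot(w[1:]) + w[0] >= 0.0 else -1
        update = eta * (target - predicted)  # update = eta * (y - y')
        w[1:] += update * xi
        w[0] += update
        errors += int(update != 0.0)
    errors_.append(errors)

print(errors_)  # the trailing entries are 0 once a separating line is found
```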
Perceptron algorithm: requires the data to be linearly separable.
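Conversely, on data that is not linearly separable the perceptron never settles. A quick sketch with XOR-style labels (data invented for illustration): the per-pass error count never reaches zero, no matter how long it trains, because no straight line separates the two classes.

```python
import numpy as np

# XOR-style labeling: not linearly separable
X_xor = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y_xor = np.array([-1, 1, 1, -1])

eta = 0.1
w = np.zeros(3)
errors_ = []
for _ in range(50):
    errors = 0
    for xi, target in zip(X_xor, y_xor):
        predicted = 1 if xi.dot(w[1:]) + w[0] >= 0.0 else -1
        update = eta * (target - predicted)
        w[1:] += update * xi
        w[0] += update
        errors += int(update != 0.0)
    errors_.append(errors)

print(errors_[-1])  # always > 0: a zero-error pass would mean a fixed w
                    # classifies all four points, which is impossible here
```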
Algorithms used in machine learning to classify data: 1. the perceptron; 2. the adaptive linear neuron.
The activation function (also called the unit step function).
The essence of machine learning is to imitate how human neurons process information. According to research in neuroscience, a neuron can be viewed as a simple logic gate with binary output.
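For instance, with hand-picked weights a single such neuron reproduces the AND logic gate (the threshold and weights below are chosen by hand for illustration):

```python
# AND gate as a single neuron: it fires (outputs 1) only when both inputs are 1
w0, w1, w2 = -1.5, 1.0, 1.0      # hand-picked threshold and weights

def neuron(x1, x2):
    z = w0 + w1 * x1 + w2 * x2   # weighted net input
    return 1 if z >= 0.0 else 0  # unit step activation

for a in (0, 1):
    for b in (0, 1):
        print(a, b, neuron(a, b))
# only the input (1, 1) makes the net input non-negative
```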
The adaptive linear neuron (Adaline)
Mathematical representation of the neuron
The activation function
Expressions for the transpose and dot product of vectors
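The dot-product notation z = wᵀx used throughout these notes can be checked with a tiny NumPy example (the vector values are invented for illustration):

```python
import numpy as np

w = np.array([0.5, -1.0, 2.0])
x = np.array([1.0, 2.0, 3.0])

# For 1-D vectors, np.dot(w, x) and w.T.dot(x) give the same scalar
z = np.dot(w, x)   # 0.5*1 + (-1)*2 + 2*3
print(z)           # 4.5
```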
Course outline
Summary of the algorithm steps
Steps of the perceptron data-classification algorithm
Positive-feedback circuit
Updating the weights